Every technology revolution has a unique inflection point. The spark that ignited the artificial intelligence movement was a statistical data analysis system developed by Jim Goodnight when he was a statistics professor at North Carolina State University 45 years ago.
He never imagined that the technology he created to improve crop yields would evolve into sophisticated data analytics software, a precursor to modern-day AI. Back then, computers could execute only 300 instructions a second and had 8K of memory. Today they can execute 3 billion instructions a second and hold multiple terabytes of memory.
Goodnight, considered the Godfather of AI, now sits at the helm of the world's largest privately held software company by revenue: SAS Institute. Despite its low profile, the Cary, North Carolina-based company had revenue of $3.27 billion last year, thanks to analytics and AI platforms used by more than 83,000 businesses, governments and universities.
In an interview with CNBC, the CEO gives his views on how AI is changing the U.S. workforce and what lies ahead.
Over the last four decades, how has data analytics software evolved? Did you ever imagine it would change the world as much as it has?
No. It has been a game changer for society. At first we used the software to analyze balanced experiments. Today we have moved into forecasting. Neural networks, which mimic the way the human brain operates, and other machine learning tools are being used to make all sorts of predictions in a host of industries.
As computer speeds grow and the amount of data explodes, this technology has become critical.
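The forecasting Goodnight describes rests on exactly this kind of model. As a minimal sketch of the idea, and not SAS code, the toy example below trains a tiny one-hidden-layer neural network by gradient descent to predict a numeric target from synthetic data (the data and network size are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the target is a noisy nonlinear function of one input,
# standing in for something like demand as a function of a driver.
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X) + 0.1 * rng.normal(size=(200, 1))

# Initialize a 1-8-1 network: one tanh hidden layer, linear output.
W1 = rng.normal(scale=0.5, size=(1, 8))
b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))
b2 = np.zeros((1, 1))

lr = 0.2
for step in range(3000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                      # gradient of 0.5*MSE w.r.t. pred
    # Backward pass: hand-derived gradients for each layer.
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0, keepdims=True)
    gh = err @ W2.T * (1 - h**2)        # tanh' = 1 - tanh^2
    gW1 = X.T @ gh / len(X)
    gb1 = gh.mean(axis=0, keepdims=True)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final training MSE: {mse:.4f}")
```

Production systems use far larger networks and richer data, but the mechanics of fitting weights to minimize prediction error are the same.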
How has it become a mainstream tool for business and public institutions?
It is used by nearly every industry in a variety of ways. Drug companies use it for clinical trial analysis. Utilities use it to predict peak demand for electricity. Retailers use it to assess buying patterns so they can figure out what sizes to stock. Banks are also using neural networks to detect credit card fraud and to prevent money laundering.
Areas where I see a surge in demand are 5G technology, connected devices, cloud services, autonomous driving, machine learning and fintech.
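The fraud-detection use case mentioned above can be illustrated with a deliberately simplified model. Real bank systems use neural networks over far richer features; this sketch fits a plain logistic-regression "fraud score" on synthetic transactions (both features and data are invented, not from any bank or SAS product):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic transactions: [amount z-score, odd-hour flag] and a fraud
# label drawn from a known logistic model, so there is signal to learn.
n = 1000
X = np.column_stack([rng.normal(size=n), rng.integers(0, 2, size=n).astype(float)])
logits_true = 2.0 * X[:, 0] + 1.5 * X[:, 1] - 2.0
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits_true))).astype(float)

# Fit weights by gradient descent on the logistic loss.
w = np.zeros(2)
b = 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted fraud probability
    w -= 0.5 * (X.T @ (p - y) / n)
    b -= 0.5 * (p - y).mean()

# Flag transactions whose predicted fraud probability is very high.
scores = 1 / (1 + np.exp(-(X @ w + b)))
print("transactions flagged for review:", int((scores > 0.9).sum()))
```

The design point the example makes is that fraud detection is a scoring problem: the model ranks transactions by estimated risk, and a threshold decides which ones a human investigates.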
What is your forecast for AI over the next decade?
I believe we will see things like computer vision — which involves machines capturing, processing and analyzing real-world images and video to extract information from the physical world — being used. Anything we can see with our eyes we can train a computer to recognize as well. This will be transformative — especially in the autonomous driving sector and in medicine.
Over the past few decades, sensors and image processors have been developed that match or even exceed the human eye's capabilities. With larger, optically near-perfect lenses and nanometer-scale image sensors and processors, the precision and sensitivity of modern cameras are remarkable compared with the human eye. Cameras can also record thousands of images per second, measure distance and see better in the dark.
Already computer vision is making a difference in health care. The medical community is using it to interpret CT scans. SAS is working with Amsterdam University to identify the size of tumors in cancer patients more accurately.
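At the bottom of every computer vision pipeline of the kind described above sits the same basic operation: sliding a filter over an image to extract structure such as edges. The sketch below shows that step on a synthetic 8x8 image with a Sobel-style kernel; modern systems stack many learned filters, but the mechanics are the same (the image and kernel here are purely illustrative):

```python
import numpy as np

def convolve2d(img, kernel):
    """Valid-mode 2-D 'convolution' — actually cross-correlation
    (no kernel flip), which is the deep-learning convention."""
    kh, kw = kernel.shape
    h = img.shape[0] - kh + 1
    w = img.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(img[i:i+kh, j:j+kw] * kernel)
    return out

# Synthetic image: dark left half, bright right half.
img = np.zeros((8, 8))
img[:, 4:] = 1.0

# Sobel-style kernel that responds to vertical edges.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

edges = convolve2d(img, sobel_x)
# The response is zero everywhere except along the dark/bright boundary.
print("columns with edge response:", np.where(edges.max(axis=0) > 0)[0])
```

Recognizing a tumor in a CT scan or a pedestrian in a dashcam frame involves learning thousands of such filters from labeled examples rather than hand-picking one, which is why these systems need humans to supply training data.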
How do you think it will change the workforce and the way companies manage operations?
The largest impact will be felt in the manufacturing industry on the factory floor. Robots with computer vision will become more sophisticated. The process has already begun; factories deploy huge numbers of industrial robots today. Over the years, robots will take on many more roles in the factory. Humans will be needed to maintain and program them.
But there are a lot of misconceptions about AI. We are nowhere near the time where robots can think like humans. That is an era far into the future. In today's world humans are needed to train these machines to recognize images and analyze data.
The talent war in the tech sector is fierce. How is SAS retaining and developing workers in this era?
Our turnover rate is 4%, and that is considered low in the tech industry, where rates hover around 14%. We lose a few people to larger tech companies, but we have no trouble replacing them. We do everything possible to make SAS Institute a great place to work, and that includes investing in training. The key is giving employees challenging work. That is more important to a tech worker than a salary.
We manage the company to unleash the power of creativity. We encourage creativity by holding demo days, where employees can share the products and technology they are working on and pitch management for funding or additional resources. Employees can also come to senior management meetings to pitch their ideas and innovations. Every employee is also expected to complete two training courses a year in a new software language so they stay up to date on the latest technology.
What advice would you give other companies grappling with the skills shortage issue?
One thing is to create education and skills training programs to develop more data scientists in the U.S. We have partnered with 82 universities, such as Michigan State and the University of Arkansas, to develop master's programs for scientists trained on SAS software. Some of these programs are linked to local businesses that are looking for a talent pipeline.
This has been a big part of our outreach strategy. For example, at North Carolina State University we helped create the Institute for Advanced Analytics, which offers a one-year course simulating a work environment. It produces 120 graduates a year trained in SAS software.