
Weaponized drones. Machines that attack on their own. 'That day is going to come'

Key Points
  • Artificial intelligence has clear positive uses, but it could be used to teach machines to attack people and their computer networks on their own.
  • Drones and autonomous vehicles could be hacked using AI and turned into weapons.
  • Traditional cybersecurity methods won't know how to cope with new attacks carried out by smart machines.

The idea of a computer program learning by itself, growing in knowledge and becoming increasingly sophisticated may be a scary one. It's even scarier when it's learning to attack things.

It's easy to dismiss artificial intelligence as yet another tech buzzword, but it's already being used in everyday applications via algorithmic processes known as machine learning.

Far from the killer robots of “Blade Runner,” machine learning applications are designed to train a computer to fulfill a certain task on its own. Machines are essentially “taught” to complete that task by doing it over and over, learning the many obstacles that could inhibit them.
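For readers who want to picture what that repeated "teaching" looks like in practice, here is a minimal sketch. It is not from the article and is far simpler than any real system: a toy program that learns, purely by trial and error, that its output should be double its input.

```python
# Minimal sketch (illustrative only): "learning by repetition" in its simplest form.
# The program starts with a random guess for a single weight, tries the task over
# and over, and nudges the weight whenever its answer is wrong.

import random

# Toy task: the correct output is always twice the input.
examples = [(x, 2 * x) for x in range(1, 11)]

weight = random.uniform(-1.0, 1.0)   # the model's initial, untrained guess
learning_rate = 0.01

for _ in range(1000):                # "doing it over and over"
    x, target = random.choice(examples)
    prediction = weight * x
    error = prediction - target
    weight -= learning_rate * error * x   # adjust based on the size of the mistake

print(f"learned weight: {weight:.3f} (ideal is 2.0)")
```

Real machine learning systems work on far larger models and messier data, but the core loop is the same: try, measure the error, adjust, repeat.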

“Such attacks, which seem like science fiction today, might become reality in the next few years,” Guy Caspi, CEO of cybersecurity start-up Deep Instinct, told CNBC’s new podcast “Beyond the Valley.”


Such technology promises many benefits, from smoother computing to the automation of tasks that, in years to come, we may take for granted as needing no human intervention. But it also has experts worried.

Hacking, then weaponizing, drones and cars

Technicians and researchers are warning about the threat such technology poses to cybersecurity, the practice that keeps our computers and data, along with those of governments and corporations, safe from hackers.

In February, a study from teams at the University of Oxford and the University of Cambridge warned that AI could be used as a tool to hack into drones and autonomous vehicles and turn them into potential weapons.

Autonomous cars like Google’s Waymo already use deep learning to evade obstacles in the real world, Caspi said, so evading traditional anti-malware systems in the cyber domain is possible.

Another study, by U.S. cybersecurity software giant Symantec, said that 978 million people across 20 countries were affected by cybercrime last year. Victims of cybercrime lost a total of $172 billion — an average of $142 per person — as a result, researchers said.

The fear for many is that AI will bring with it a dawn of new forms of cyber breaches that bypass traditional means of countering attacks.

“We’re still in the early days of the attackers using artificial intelligence themselves, but that day is going to come,” warned Nicole Eagan, CEO of cybersecurity firm Darktrace. “And I think once that switch is flipped on, there’s going to be no turning back, so we are very concerned about the use of AI by the attackers in many ways because they could try to use AI to blend into the background of these networks.”

CNBC’s “Beyond the Valley” brings listeners the brightest minds in technology discussing all the trends shaping the tech industry — and your world. Listen to the podcast or sign up for the “Beyond the Valley” newsletter here.