
These Chinese hackers tricked Tesla's Autopilot into suddenly switching lanes

Tesla Model S P85D dual electric motor sedan. (Photo: Myung J. Chun | Los Angeles Times | Getty Images)

A group of Chinese hackers published a report showing how they tricked Tesla's Autopilot self-driving software into swerving into an oncoming traffic lane.

The group of cybersecurity researchers from Keen Security Labs in China placed brightly colored stickers on the road of a test course to create a "fake lane," tricking the self-driving software of a Tesla Model S into veering from its proper driving lane into the opposing lane, where oncoming traffic would be driving in a real-world scenario.

Tesla CEO Elon Musk took to Twitter on Monday to commend the researchers for what he described as "solid work."


"Tesla autopilot module's lane recognition function has a good robustness in an ordinary external environment (no strong light, rain, snow, sand and dust interference), but it still doesn't handle the situation correctly in our test scenario," Keen Security Labs wrote in their report, which was published online in March.

In a video the researchers posted to YouTube last week, a Tesla Model S can be seen, at around the one-minute-20-second mark, driving on a test course and automatically switching lanes upon encountering the decoy stickers.

"[T]his kind of attack is simple to deploy," according to the researchers at Keen Security Labs, which is run by the Chinese tech giant Tencent, and they noted that it is "easy to obtain" the simple circular stickers they used to pull off the stunt.

In another experiment, the researchers were able to trick the Tesla's automatic windshield wipers — which are powered by cameras and the Autopilot's computer vision software — by placing a television screen with images of water on it in front of the vehicle, causing the wipers to spring into action.

In a statement provided to CNBC Make It, a Tesla spokesperson pointed out that in the Keen Security Labs' tests "the physical environment around the vehicle is artificially altered." The Tesla spokesperson added that the vulnerability "is not a realistic concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should always be prepared to do so, and can manually operate the windshield wiper settings at all times."


This isn't the first time that Keen Security Labs has successfully hacked Tesla's products.

The researchers are what are known as ethical, or "white hat," hackers, because their research is meant to improve the security of the products and companies they hack. In fact, they are listed in Tesla's "Security Researcher Hall of Fame" on the electric automaker's website.

In the past, Keen Security Labs has taken part in the "bug bounty program" Tesla launched in 2014, which currently offers rewards of up to $15,000 to hackers who alert Tesla to potential vulnerabilities in the company's software and other products. Recently, Tesla even awarded a Model 3 to a pair of hackers who exposed a security bug that allowed them to take control of the car's internal web browser.

In this case, though, Tesla noted that the research from Keen Security Labs does not qualify for its bug bounty program because drivers can override Autopilot at any time.

"We know it took an extraordinary amount of time, effort, and skill, and we look forward to reviewing future reports from this group," Tesla's spokesperson said.

The company also said it has issued security updates over the past two years that already fixed other vulnerabilities mentioned in the hackers' report.

Currently, Tesla's Autopilot software is not fully autonomous: it can handle some driving functions in certain circumstances, but a human driver has to be ready at all times to take control of the vehicle.

And federal law in the U.S. requires that any cars with autonomous technology still be designed with conventional driver controls so that a human can quickly and easily take control of the vehicle. (Laws in China, where the researchers' tests took place, have similar stipulations.) But those laws are expected to change as autonomous vehicles evolve.


