Elon Musk says Tesla will open part of its self-driving software to the public as a safety measure

  • Elon Musk tells hackers at the private DEFCON conference that Tesla will share its security software with other carmakers as open source.
  • He says it's a bid to make autonomous vehicle software safer by opening the software to more scrutiny, according to people who attended the gathering.
Elon Musk speaks onstage at Elon Musk Answers Your Questions! during SXSW at ACL Live on March 11, 2018 in Austin, Texas.
Diego Donamaria | Getty Images | SXSW

Tesla CEO Elon Musk told a hacker conference in Las Vegas he plans to "open source" the software Tesla uses to secure autonomous-driving features from hacks or takeovers, eventually allowing other carmakers to use it.

It's a bid to make autonomous vehicle software safer by opening the software to more scrutiny, he told a private audience of around 100 people on Friday at DEF CON, an annual hacker and cybersecurity conference held in Las Vegas.

"I think one of the biggest concerns for autonomous vehicles is somebody achieving a fleet-wide hack," he said, according to people who attended. Musk confirmed the decision in a tweet on Saturday, writing that it was "extremely important to a safe self-driving future for all."

Musk said the move is partly meant to show that Tesla is putting security concerns above worries over protecting intellectual property, according to the people. It's a departure from self-driving competitors that have fiercely protected their intellectual property, often through litigation (see Uber Technologies' dispute with Alphabet's Waymo).

For many companies, keeping source code secret has itself been considered a security measure. Proprietary code makes a less attractive target, the thinking goes, because criminals would rather find and exploit software that many corporations use at once, giving them access to more victims with a single flaw.

But "obscurity" as a security strategy has proven ineffective in several cases.

For instance, software that would previously have been considered obscure — such as the code that runs voting machines or the control rooms of electrical plants — has proven both vulnerable and attractive to criminals. In addition, while keeping code secret makes it harder for attackers to find and exploit holes, it also makes it harder for security researchers and customers to find those holes and demand a fix.

A bug bounty aficionado

Musk has long invited hackers to test Tesla's systems, and the company has one of the industry's most robust "bug bounty" programs. Bug bounties involve inviting cybersecurity professionals to hack the company's systems in exchange for a monetary reward, public recognition or both.

According to information from Bugcrowd, a company that facilitates Tesla's bug bounty program, these rewards range from $100 to $10,000. The average payout to successful hackers in the past three months has been around $1,860, according to Bugcrowd.

Engaging security professionals this way relies on a set of rules defined informally among bug bounty companies, the corporations that wish to engage hackers and the hackers themselves.

Those rules include that hackers who find a vulnerability must give companies time to validate and fix it before disclosing it publicly. This is meant to keep criminals from learning about the problem before it can be fixed.

Other rules include making efforts to avoid privacy violations and not modifying or destroying any data the bug bounty seekers access.