Two consumer safety groups are calling for federal and state investigations of Tesla's semi-autonomous technology in the wake of several fatal crashes linked to the system earlier this year.
The investigations could pose a major threat to the California electric vehicle maker, as Tesla CEO Elon Musk has promised that a fully autonomous version of the company's driver-assistance technology, known as Autopilot, will be released this year. He said during a conference call this week that the company expects to generate significant revenue from fleets of "robotaxis" it intends to roll out in 2020 using Autopilot.
"We feel Tesla violates the laws on deceptive practices, both at the federal and state level," said Jason Levine, the head of the Washington, D.C.-based Center for Auto Safety, one of two groups that have called for an investigation of both the Autopilot system and Tesla's promotion of the technology.
The CAS, along with California's non-profit Consumer Watchdog, pointed to a number of crashes, injuries and deaths that have occurred over the last several years involving Tesla vehicles operating in Autopilot mode. That includes a crash in May in which a Model S sedan slammed into a parked police car. Two months earlier, a driver was killed when his Model 3 sedan struck a semi-trailer in Delray Beach, Florida, shearing off the car's roof.
First introduced in October 2015, Autopilot is what is known, in industry terms, as an Advanced Driver Assistance System, or ADAS. A number of other manufacturers have launched similar technologies, such as Cadillac's Super Cruise and Audi's Traffic Jam Pilot. Some systems can, under very limited circumstances, permit a driver to briefly take hands off the wheel. They all require the motorist to be ready to take immediate control in an emergency.
In the March 1 crash in Florida, the National Transportation Safety Board determined the driver switched on Autopilot 10 seconds before impact and didn't have his hands on the wheel for the final eight seconds.
The agency has made similar findings in other crashes, several of them also fatal.
For its part, Tesla has defended Autopilot. In a statement released in May, it said, "As our quarterly safety reports have shown, drivers using Autopilot register fewer accidents per mile than those driving without it."
That has not been backed up by independent research, however, and Tesla has had to walk back claims that its safety record was supported by the National Highway Traffic Safety Administration.
The automaker also said in a statement that there is nothing about the name, Autopilot, that should mislead consumers.
"Presumably they are equally opposed to the name "Automobile," the statement suggested. The company also argued that it has gone to great lengths to make consumers aware of the limits of the system, in its owner's manuals, on its website and elsewhere.
CAS's Levine dismisses such claims as "legalese," citing the many ways Tesla and Musk have promoted the system. That includes pictures released soon after Autopilot debuted, among them shots of Musk and his then-wife driving off with their hands waving out the windows of a Tesla vehicle. Musk also appeared to imply the system could work hands-free during a December 2018 interview on the CBS newsmagazine "60 Minutes."
"They can say they've written language to cover their liabilities but their actions portray a desire to deceive consumers," said Levine, in an interview.
Together with Consumer Watchdog, the Center wants both the Federal Trade Commission and the California Department of Motor Vehicles to launch immediate probes. The groups contend the automaker violated Section 5 of the FTC Act, as well as California consumer law, arguing that the way Tesla markets Autopilot is "materially deceptive and … likely to mislead consumers into reasonably believing that their vehicles have self-driving or autonomous capabilities."
Despite such concerns, Tesla has been working to update the Autopilot system, and Musk earlier this month repeated his promise to introduce a "full self-driving" version before the end of the year.
The CEO has promised to put as many as 1 million robotaxis on the road by 2020, a direct challenge to such ride-sharing services as Uber and Lyft that are working on their own self-driving technologies.
Musk has indicated that the service would provide a new source of revenue for the company. On Wednesday, Tesla posted an adjusted loss of $1.12 a share for the second quarter, far wider than the 40-cent loss analysts surveyed by Refinitiv were expecting, and shares have fallen sharply since the report. Anything that disrupts the robotaxi program could complicate Tesla's efforts to turn its finances around.
"There is no question the (Autopilot) technology is impressive," said CAS chief Levine, but Tesla's continued reliance on what he called "hyperbolic statements" misleads consumers and poses serious safety risks.
Correction: This article was updated to remove reference to a study by the Insurance Institute for Highway Safety that was inaccurately characterized. It surveyed 2,000 drivers about five different semi-autonomous driving systems currently on the market. They weren't owners of the cars they were surveyed about and hadn't tested any of the systems.