- Tesla must provide the National Highway Traffic Safety Administration with extensive data about its driver assistance system, marketed as Autopilot, by October 22, 2021.
- NHTSA is trying to determine whether Tesla's Autopilot has a safety defect that caused Tesla vehicles to hit first-responder vehicles.
- The agency has the authority to mandate a recall if it determines a car, or any part or system within it, has a safety defect.
The National Highway Traffic Safety Administration has added a 12th crash to the scope of its investigation into Tesla's Autopilot system, and is demanding that the company provide extensive data about its driver assistance systems by Oct. 22.
Autopilot is Tesla's driver assistance system that comes standard with all of its newer models. Tesla also sells a more advanced version under the brand name "Full Self Driving," for $10,000 upfront or as a $199-per-month subscription in the U.S. Neither Autopilot nor FSD makes Tesla vehicles safe to operate without a driver at the wheel -- the systems can control some aspects of the car, but "active driver supervision" is required, according to Tesla's website.
As CNBC previously reported, NHTSA's office of defects investigation kicked off a safety probe in August after the agency determined that Autopilot was in use before collisions between Tesla electric cars and first responder vehicles. Those prior crashes were responsible for 17 injuries and one fatality.
A more recent crash in Orlando, Florida, involving a Tesla Model 3 and a police car, is now part of the investigation. The Tesla driver in that incident narrowly missed a trooper, and told officers she was using the car's Autopilot feature at the time of the collision.
NHTSA's letter to Tesla also sets a deadline of October 22, 2021, by which the company must provide extensive Autopilot-related and vehicle data to the federal auto safety agency.
NHTSA has the power to mandate recalls if it determines a vehicle or any part of it is defective, including software-defined systems like Autopilot.
In the letter, addressed to Tesla's director of field quality, Eddie Gates, NHTSA provides a detailed list of the information it needs to evaluate in order to determine whether Tesla's Autopilot and Traffic-Aware Cruise Control caused or contributed to crashes with first responder vehicles.
A professor of electrical and computer engineering at Carnegie Mellon University, Phil Koopman, characterized NHTSA's data request as "really sweeping."
He noted that the agency asked for information about Tesla's entire Autopilot-equipped fleet, encompassing cars, software and hardware Tesla sold from 2014 to 2021 (not just the 12 vehicles involved in the emergency responder crashes).
He said, "This is an incredibly detailed request for huge amounts of data. But it is exactly the type of information that would be needed to dig in to whether Tesla vehicles are acceptably safe."
The National Transportation Safety Board, another federal safety watchdog, has called on NHTSA to impose stricter standards on automated vehicle tech including Tesla Autopilot.
Tesla did not immediately respond to a request for comment.