Tesla Said It Fixed Autopilot: The Feds Aren't Convinced, And The Fallout Could Be Huge
Tesla issued a remarkable recall covering over 2 million cars in December last year, citing Autopilot safety issues, but it seems the woes are far from over for the carmaker. The National Highway Traffic Safety Administration (NHTSA) has now launched a fresh investigation, with the goal of analyzing the "remedy adequacy" following the recall that entailed Tesla fixing the software issues.
The Office of Defects Investigation (ODI) says it found "concerns due to post-remedy crash events and results from preliminary NHTSA tests of remedied vehicles." However, the agency won't specify what issues it has discovered in Tesla cars following the software patch aimed at fixing the flaw that prompted the widespread recall. The agency is also looking into measures that Tesla pushed to remedy Autopilot safety concerns but that weren't originally detailed as part of the recall communication.
"This investigation will consider why these updates were not a part of the recall or otherwise determined to remedy a defect that poses an unreasonable safety risk," says the agency in a memo (PDF) released earlier today. The latest NHTSA investigation comes at a crucial juncture for Tesla. Sales and profits have taken a steep plunge, experts continue to question the safety measures in place, and lawsuits over its advanced driver assistance systems (ADAS) are piling up, while CEO Elon Musk continues to push autonomy as the future of the company.
Not a good picture for Musk's lofty autonomy goals
The recall investigation is not the only trouble facing Tesla. The NHTSA has separately published another investigation that squarely blames Tesla's tech stack, and specifically the safety measures put in place. "The warnings provided by Autopilot when Autosteer was engaged did not adequately ensure that drivers maintained their attention on the driving task," notes the NHTSA investigation report.
Interestingly, the NHTSA's report also highlights its own lack of power: it can flag risks but can't force the company to make specific changes to fix an issue. And that's where the Autopilot situation becomes murky. Tesla maintains that its driver assistance system requires the driver to always be attentive behind the wheel and ready to take over control, in accordance with SAE Level 2 guidelines, yet the agency's crash data suggests that expectation often goes unmet.
During the course of the investigation, the NHTSA studied 956 crash incidents spanning over half a decade, which resulted in 29 casualties. Notably, the agency says it "observed a trend of avoidable crashes involving hazards that would have been visible to an attentive driver." It notes that incidents "with no or late evasive action attempted by the driver" were a recurring pattern. Such patterns suggest Tesla could argue that its explicit guidelines weren't followed, but experts are of the opinion that the company should take a more proactive approach.