Tesla Autopilot investigation closed after feds find 13 fatal crashes related to misuse

The National Highway Traffic Safety Administration closed a long-standing investigation into Tesla’s Autopilot driver assistance system after reviewing hundreds of crashes involving its misuse, including 13 that were fatal and “many more involving serious injuries.”

At the same time, NHTSA is opening a new investigation to evaluate whether the Autopilot recall fix that Tesla implemented in December is effective enough.

NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” which turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”

“This mismatch resulted in a critical safety gap between drivers’ expectations of [Autopilot’s] operating capabilities and the system’s true capabilities,” the agency wrote. “This gap led to foreseeable misuse and avoidable crashes.”

The closing of the initial probe, which began in 2021, marks an end of one of the most visible efforts by the government to scrutinize Tesla’s Autopilot software. The Department of Justice is also scrutinizing the company’s claims about the technology, and the California Department of Motor Vehicles has accused Tesla of falsely advertising the capabilities of Autopilot and the more-advanced Full Self-Driving beta software. Tesla, meanwhile, is now going “balls to the wall for autonomy,” according to CEO Elon Musk.

NHTSA said its investigation reviewed 956 reported crashes up until August 30, 2023. In roughly half (489) of those, the agency said either there “was insufficient data to make an assessment,” the other vehicle was at fault, Autopilot was found to not be in use, or the crash was otherwise unrelated to the probe.

NHTSA said the remaining 467 crashes fell into three buckets. In 211 crashes, “the frontal plane of the Tesla struck another vehicle or obstacle with adequate time for an attentive driver to respond to avoid or mitigate the crash.” It said 145 crashes involved “roadway departures in low traction conditions such as wet roadways.” And it said 111 of the crashes involved “roadway departures where Autosteer was inadvertently disengaged by the driver’s inputs.”

Tesla tells drivers they need to pay attention to the road and keep their hands on the wheel while using Autopilot, which it measures via a torque sensor and, in its newer cars, the in-cabin camera. But NHTSA, and other safety groups, have said that these warnings and checks do not go far enough. In December, NHTSA said these measures were “insufficient to prevent misuse.”

Tesla agreed to issue a recall via a software update that would theoretically increase driver monitoring. But that update did not appear to meaningfully change Autopilot’s behavior — a conclusion NHTSA seems to share.

Parts of that recall fix require the “owner to opt in,” and Tesla allows a driver to “readily reverse” some of the safeguards, according to NHTSA.

This story is developing…
