November 23, 2024
NHTSA presses Tesla for more records in Autopilot safety probe
US auto safety regulators are demanding more records from Tesla as part of an ongoing investigation into Autopilot safety.


Tesla must send extensive new records to the National Highway Traffic Safety Administration as part of an Autopilot safety probe — or else face steep fines.

If Tesla fails to supply the federal agency with information about its advanced driver assistance systems, which are marketed as Autopilot, Full Self-Driving and FSD Beta options in the U.S., the company faces “civil penalties of up to $26,315 per violation per day, with a maximum of $131,564,183 for a related series of daily violations,” according to a letter published on the NHTSA website Thursday.

NHTSA initiated an investigation into Autopilot safety in 2021 after identifying a string of crashes in which Tesla vehicles using Autopilot collided with stationary first responders’ vehicles and road work vehicles.

To date, none of Tesla’s driver assistance systems are autonomous, and the company’s cars cannot function as robotaxis like those operated by Cruise or Waymo. Instead, Tesla vehicles require a driver behind the wheel, ready to steer or brake at any time. Autopilot and FSD only control braking, steering and acceleration in limited circumstances.

Among other details, the federal vehicle safety authority wants information on which versions of Tesla’s software, hardware and other components have been installed in each car that was sold, leased or in use in the U.S. from model years 2014 to 2023, as well as the date when any Tesla vehicle was “admitted into the ‘Full-Self Driving beta’ program.”

The company’s FSD Beta consists of driver assistance features that have been tested internally but have not been fully debugged. Tesla uses its customers as software and vehicle safety testers via the FSD Beta program, rather than relying on professional safety drivers, as is the industry standard.

Tesla previously conducted voluntary recalls of its cars due to issues with Autopilot and FSD Beta and promised to deliver over-the-air software updates that would remedy the issues.

A notice on the NHTSA website in February 2023 said Tesla’s FSD Beta driver assistance system may “allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.”

According to data tracked by NHTSA, there have been 21 known fatal collisions involving Tesla vehicles equipped with the company’s driver assistance systems — more than for any other automaker that offers a similar system.

According to a separate letter published Thursday, NHTSA is also reviewing a petition from an automotive safety researcher, Ronald Belt, who asked the agency to reopen an earlier probe to determine the underlying causes of “sudden unintended acceleration” events that have been reported to NHTSA.

In a sudden unintended acceleration event, a driver may be parked or driving at a normal speed when the car lurches forward unexpectedly, potentially leading to a collision.

Tesla’s vice president of vehicle engineering, Lars Moravy, did not immediately respond to a request for comment. 

Read the full letter from NHTSA to Tesla requesting extensive new records.