US probes Tesla recall of 2 million vehicles over Autopilot, citing concerns


WASHINGTON – US auto safety regulators said on April 26 they have opened an investigation, following a series of crashes, into whether Tesla’s recall of more than two million vehicles announced in December to install new Autopilot safeguards is adequate.

The National Highway Traffic Safety Administration (NHTSA) said it opened the probe after receiving reports of 20 crashes involving vehicles that had the new Autopilot software updates installed under Tesla’s recall.

The new investigation adds to regulatory scrutiny of Autopilot at a time when chief executive Elon Musk is pushing for full self-driving, as Tesla is offering a month of free trials and plans to unveil its robotaxi on Aug 8.

The agency said it had concerns following those 20 crashes as well as results from preliminary NHTSA tests of updated vehicles.

Also on April 26, the agency closed its nearly three-year defect investigation into Autopilot, saying it found evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities”, resulting in a “critical safety gap”.

NHTSA disclosed on April 26 that during its Autopilot safety probe launched in August 2021, it identified at least 13 Tesla crashes involving at least one death and many more involving serious injuries in which “foreseeable driver misuse of the system played an apparent role”. NHTSA listed reports of 54 serious injuries in Autopilot crashes involving potential driver misuse.

NHTSA noted that Tesla’s December 2023 recall “allows a driver to readily reverse” the software update. The agency said Tesla has issued additional software updates to address issues related to its concerns but has not made them part of the recall.

Tesla said in December its largest-ever recall covering 2.03 million US vehicles – or nearly all of its vehicles on US roads – was to better ensure that drivers pay attention when using its advanced driver-assistance system.

The new recall investigation covers US Model Y, X, S and 3 vehicles and the Cybertruck equipped with Autopilot, produced between the 2012 and 2024 model years.

NHTSA said there were gaps in Tesla’s telematic data reporting on crashes involving Autopilot, as the automaker primarily gets data from crashes involving airbag deployments, which account for only about one-fifth of police-reported crashes.

US Senators Ed Markey and Richard Blumenthal on April 26 urged NHTSA to bar Tesla from allowing the use of Autopilot on roads where it is not intended, asking the agency “to take all necessary actions to prevent these vehicles from endangering lives”.

NHTSA also on April 26 raised concerns that Tesla’s Autopilot name “may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation”.

Tesla shares were down 1.1 per cent at US$168.26 in the afternoon on April 26. The company did not immediately respond to a request for comment. It has said repeatedly that Autopilot does not make vehicles self-driving and is intended for use with a fully attentive driver who is prepared to take over and has hands on the steering wheel.

In February, Consumer Reports, a non-profit research organisation, said its testing of Tesla’s Autopilot recall update found that changes did not adequately address many safety concerns raised by NHTSA and urged the agency to require the automaker to take “stronger steps”.

Tesla’s Autopilot is intended to enable cars to steer, accelerate and brake automatically within their lane, while enhanced Autopilot can assist in changing lanes on highways, but does not make vehicles autonomous.


One component of Autopilot is Autosteer, which maintains a set speed or following distance and works to keep a vehicle in its driving lane.

Tesla said in December it did not agree with NHTSA’s analysis but would deploy an over-the-air software update that would “incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to... continuous driving responsibility whenever Autosteer is engaged.”

Since 2016, NHTSA has opened more than 40 special crash investigations into Tesla vehicles in cases where driver-assistance systems such as Autopilot were suspected of being in use, with 23 crash deaths reported to date.

Tesla’s software upgrade under the recall includes more prominent visual alerts, disengagement of Autosteer if drivers fail to respond to inattentiveness warnings, and additional checks upon engaging Autosteer. Tesla said it would restrict Autopilot use for one week if significant improper use was detected.

Tesla disclosed in October that the US Justice Department issued subpoenas related to its Full Self-Driving (FSD) and Autopilot programs. Reuters reported in October 2022 that Tesla was under criminal investigation.

Tesla in February 2023 recalled 362,000 US vehicles to update its FSD Beta software after NHTSA said the vehicles did not adequately adhere to traffic safety laws and could cause crashes. REUTERS
