US to investigate Tesla's “Full Self-Driving” system after pedestrian killed
DETROIT – The US government's highway safety agency is investigating Tesla's “Full Self-Driving” system after receiving reports of crashes in low-visibility conditions, including one that killed a pedestrian.
The National Highway Traffic Safety Administration said in documents that it opened the investigation Thursday after the company reported four crashes in which Teslas encountered sun glare, fog and airborne dust.
In addition to the pedestrian's death, another person was injured in one of the crashes, the agency said.
Investigators will examine the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”
The investigation covers about 2.4 million Teslas from the 2016 to 2024 model years.
A message was left Friday seeking comment from Tesla, which has repeatedly said the system cannot drive itself and that human drivers must always be ready to intervene.
Last week Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi without a steering wheel or pedals. CEO Elon Musk, who has promised autonomous vehicles before, said the company plans to have autonomous Models Y and 3 running without human drivers next year. Robotaxis without steering wheels will be available in California and Texas starting in 2026, he said.
It is unclear how the investigation will affect Tesla's self-driving ambitions. NHTSA would have to approve any robotaxi without pedals or a steering wheel, and that is unlikely to happen while the investigation is open. And if the company tries to deploy autonomous driving in its existing models, regulation would likely fall to the states. There are no federal regulations focused specifically on autonomous vehicles, although they must meet broader safety rules.
NHTSA also said it will look into whether any other similar crashes involving “Full Self-Driving” occurred in low-visibility conditions, and it will seek information from the company about whether any software updates affected the system's performance in those conditions.
“Specifically, this review will assess the timing, purpose and capability of any such updates, as well as Tesla's assessment of their safety impact,” the document states.
Tesla reported the four crashes to NHTSA under an agency order covering all automakers. An agency database says the pedestrian was killed in November 2023 after being struck by a 2021 Tesla Model Y in Rimrock, Arizona, about 100 miles (161 kilometers) north of Phoenix.
The Arizona Department of Public Safety said in a statement that the crash happened on Interstate 17 shortly after 5 p.m. on Nov. 27. Two vehicles had collided on the freeway, blocking the left lane. A Toyota 4Runner stopped, and two people got out to help direct traffic. A red Tesla Model Y then struck the 4Runner and one of the people who had exited it. A 71-year-old woman from Mesa, Arizona, was pronounced dead at the scene.
The Tesla driver was not charged because the sun was in the driver's eyes, said Raul Garcia, the department's public information officer. Sun glare was also a contributing factor in the first collision, he added.
Tesla has twice recalled “Full Self-Driving” under pressure from NHTSA, which in July sought information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.
The recalls were issued because the system was programmed to run stop signs at slow speeds and because it violated other traffic laws. Both issues were fixed via online software updates.
Critics say Tesla's system, which relies on cameras alone to detect hazards, lacks the sensors needed for full self-driving. Nearly all other companies working on autonomous vehicles use radar and laser sensors in addition to cameras to see better in the dark and in poor visibility.
Musk says that humans drive by sight alone, so cars should be able to drive by camera alone. He called lidar (light detection and ranging), which uses lasers to detect objects, a “fool's errand”.
The “Full Self-Driving” recalls came after a three-year investigation into Tesla's less sophisticated Autopilot system crashing into emergency vehicles and other vehicles parked along highways, many with warning lights flashing.
That investigation was closed last April after the agency pressured Tesla into recalling its vehicles to bolster a weak driver-monitoring system meant to ensure drivers are paying attention. A few weeks after the recall, NHTSA began investigating whether the recall was working.
NHTSA began its Autopilot crash investigation in 2021 after receiving 11 reports of Teslas using Autopilot hitting parked emergency vehicles. In documents explaining why the investigation was closed, NHTSA said it ultimately found 467 crashes involving Autopilot that resulted in 54 injuries and 14 deaths. Autopilot is essentially a fancy version of cruise control, while “Full Self-Driving” is billed by Musk as capable of driving without human intervention.
The investigation opened Thursday takes NHTSA into new territory. Previously, the agency viewed Tesla's systems as assisting drivers rather than driving the cars themselves; now it is focusing on the capabilities of “Full Self-Driving” rather than simply on whether drivers are paying attention.
Michael Brooks, executive director of the nonprofit Center for Auto Safety, said previous investigations into Autopilot didn't look at why Teslas couldn't see emergency vehicles and stop.
“Earlier, they put the liability on the driver instead of the car,” he said. “Here they are saying these systems are not capable of appropriately detecting safety hazards, whether drivers are paying attention or not.”