Mesa Woman Killed in Tesla 'Full Self-Driving' Crash; Feds to Investigate Automaker

Phoenix (AZFamily/AP) — An Arizona woman was killed by a driver using Tesla's “full self-driving” feature as the automaker faces a National Highway Traffic Safety Administration (NHTSA) investigation.

Under an agency reporting order that covers all automakers, Tesla reported four crashes to NHTSA, including one in northern Arizona that killed a Valley woman.

An agency database says the pedestrian was struck and killed by a 2021 Tesla Model Y in November 2023 in Rimrock, Arizona, near Camp Verde and about 100 miles north of Phoenix.

The Arizona Department of Public Safety (DPS) said in a statement that the crash happened just after 5 p.m. last November on Interstate 17. Two vehicles had collided on the freeway, blocking the left lane. A Toyota 4Runner stopped, and two men got out to help direct traffic.

A red Tesla Model Y then struck the 4Runner and a person exiting it. A 71-year-old Mesa woman was pronounced dead at the scene, officials said.

Because the sun was in the Tesla driver's eyes, he was not charged, DPS Public Information Officer Raul Garcia said. Sun glare also contributed to the first collision, he added.

Investigators will examine whether “full self-driving” can “identify and respond appropriately to conditions that reduce road visibility and, if so, the circumstances contributing to these accidents.”

The investigation covers about 2.4 million Teslas from the 2016 to 2024 model years.

A message was left Friday seeking comment from Tesla, which has repeatedly said the system cannot drive itself and that human drivers must always be ready to intervene.

Last week, Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi that doesn't have a steering wheel or pedals.

Tesla CEO Elon Musk, who has previously promised autonomous vehicles, said the company plans to launch autonomous versions of the Model Y and Model 3 next year without human drivers. Robotaxis without steering wheels will be available in California and Texas starting in 2026, he said.

The impact of the investigation on Tesla's self-driving ambitions is unclear. NHTSA has to approve any robotaxi without pedals or a steering wheel, and it's unlikely that will happen during the investigation. But if the company tries to put autonomous vehicles into its existing models, it will likely run into state regulations.

There are no federal regulations specifically focusing on autonomous vehicles, although they must meet broader safety rules.

NHTSA also said it would examine whether any other similar crashes involving “full self-driving” occurred in low-visibility conditions, and it will seek information from the company about whether any software updates affected the system's performance in those conditions.

“Specifically, this review will assess the timing, purpose and capability of any such updates, as well as Tesla's assessment of their safety impact,” the document states. Tesla has twice recalled “full self-driving” under pressure from NHTSA, which in July sought information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.

The recalls were issued because the system was programmed to run stop signs at slow speeds and because it violated other traffic laws. Both issues were fixed via online software updates.

Critics say Tesla's system, which uses only cameras to detect hazards, lacks the right sensors to be fully self-driving. Almost all companies working on autonomous vehicles use radar and laser sensors in addition to cameras to see better in dark or poor visibility conditions.

Musk says that humans drive by sight alone, so cars should be able to drive with cameras alone. He has called lidar (light detection and ranging), which uses lasers to detect objects, a “fool's errand.”

The “full self-driving” investigation comes after a three-year probe into Tesla's less sophisticated Autopilot system crashing into emergency vehicles and other vehicles parked on highways, many with warning lights flashing.

That investigation was closed last April after the agency pressured Tesla to recall its vehicles to bolster a weak system that ensures drivers are paying attention. A few weeks after the recall, NHTSA began investigating whether the recall was working.

NHTSA launched its Autopilot crash investigation after receiving 11 reports in 2021 that Teslas using Autopilot had hit parked emergency vehicles. In the document explaining why that investigation was closed, NHTSA said it ultimately found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths.

Autopilot is essentially a fancy version of cruise control, while “full self-driving” is billed by Musk as capable of driving without human intervention.

The investigation, which was opened Thursday, enters new territory for NHTSA, which previously viewed Tesla's systems as assisting drivers rather than driving cars themselves. With the new investigation, the agency is focusing on the capabilities of “full self-driving” rather than simply making sure drivers are paying attention.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said previous investigations into Autopilot didn't look at why Teslas couldn't see emergency vehicles and stop.

“Earlier they put the liability on the driver instead of the car,” he said. “Here they are saying that these systems are not able to accurately detect safety hazards whether drivers are paying attention or not.”
