Tesla faces new NHTSA investigation into its ‘Autopilot’ artificial intelligence system after three people die in a crash involving a Model S


Electric vehicle maker Tesla is under US government investigation over its AI-powered Autopilot system, after a Model S crash left three people dead and three injured in California this month. The US auto safety regulator said on Wednesday that it had opened an investigation into the fatal crash, which may have involved the vehicle's advanced driver assistance system.

The crash, in which a 2022 Tesla Model S struck construction equipment in Newport Beach last week, is one of 35 crashes involving Tesla vehicles that the US National Highway Traffic Safety Administration (NHTSA) has investigated since 2016 in which advanced driver assistance systems such as Autopilot were suspected of being in use.

A total of 14 deaths have been reported in connection with these Tesla investigations, including the three most recent. NHTSA confirmed the new investigation into the May 12 Tesla Model S crash, which killed three people in the vehicle and injured three workers when it struck construction equipment along the Pacific Coast Highway in Newport Beach. Newport Beach police declined to say on Wednesday whether the Tesla was on Autopilot at the time of the crash, saying the investigation is ongoing.

Tesla's Autopilot and other driver assistance systems that take over certain driving tasks are drawing increasing scrutiny. Tesla says on its website that Autopilot provides driver assistance by allowing vehicles to steer, accelerate, and brake automatically, but that it requires active driver supervision and does not make the vehicle self-driving. NHTSA notes that no vehicle on sale today is self-driving and that drivers must remain attentive at all times.

Last June, NHTSA ordered automakers to report any crash on public roads involving vehicles that are fully autonomous or equipped with partially automated driver assistance systems. Partially automated systems can keep a vehicle centered in its lane and maintain a safe distance from vehicles ahead. According to NHTSA, the data can reveal whether there are common patterns in crashes involving these systems.

Each year, NHTSA sends special teams to conduct more than 100 in-depth investigations into notable real-world crashes, producing timely findings that the automotive safety community can use to improve the performance of advanced safety systems.

Of the 35 special crash investigations NHTSA has conducted on Tesla vehicles since 2016 involving advanced driver assistance systems, Autopilot use was ruled out in three cases. NHTSA said separately on Wednesday that in April it opened another special investigation into a Florida crash involving a 2016 Tesla Model X that resulted in a minor injury and may also have involved the use of an advanced driver assistance system.

In August, NHTSA said it had opened a formal preliminary evaluation of the Autopilot system after identifying at least a dozen crashes in which Tesla models struck emergency vehicles. That investigation is still ongoing.
NHTSA is also investigating two crashes involving Volvos, a Navya shuttle crash, two involving Cadillacs, one involving a Lexus, and one involving a Hyundai. One of the Volvo crashes involved an Uber self-driving test vehicle that struck and killed a pedestrian in Arizona in March 2018.

The investigation is the second blow to Tesla this week, after security consultant Sultan Qasim Khan revealed a hack that can unlock Tesla Model 3 and Model Y cars and start them without having to break into them. Khan says it is possible to trick the car's keyless entry system into thinking the owner is near the vehicle; this unlocks the doors and allows the car to be started, making it hypothetically easy for a thief to drive away in an expensive Tesla without even a scratch on the paint.

Source: NHTSA (1, 2)

And you?

What is your opinion on the subject?
Do you think the Autopilot and FSD names mislead drivers? Should they, in your opinion, be changed?

See also:

Tesla tells U.S. lawmakers Autopilot requires ‘constant monitoring’, bolsters criticism that name misleads drivers

Tesla must now report Autopilot-related crashes to the government or face fines, federal road safety agency says

Tesla Autopilot: US investigates feature after 11 Teslas crash into emergency vehicles

Thirty Tesla crashes linked to the assisted driving system are being investigated in the US; the crashes have claimed 10 lives since 2016

US Government asks Tesla why people can play video games in moving cars, feature under review
