Car: Authority is examining Tesla’s improvements to the “Autopilot” system

US regulators analyzed several hundred accidents involving Tesla's "Autopilot" system. Their conclusion: many of them could have been avoided if the people behind the wheel had been paying attention.

The US traffic safety authority has opened a new investigation into Tesla's "Autopilot" driver-assistance system. It is examining whether an "Autopilot" update rolled out in December is sufficient to address the authority's safety concerns. In an investigation lasting several years, the NHTSA (National Highway Traffic Safety Administration) concluded that "Autopilot" made it too easy for drivers to cede complete control to the system, even though they must constantly keep an eye on the traffic situation.

The NHTSA analyzed a total of 956 accidents from January 2018 to August 2023, 29 of which resulted in fatalities. In many cases, the accidents could have been avoided if the drivers had been paying attention, the authority emphasized in its report. In 59 of the 109 collisions for which there was enough data for such an analysis, the obstacle was visible at least five seconds before the accident. As an example, the NHTSA cited an accident in March 2023 in which a minor exiting a school bus was struck by a Model Y and seriously injured.

Gaps in Tesla’s collection of vehicle data

With the over-the-air update, carried out as an official recall campaign, Tesla introduced, among other things, additional warnings for drivers. The electric car manufacturer points out that "Autopilot" does not make a Tesla a self-driving car and that the person behind the wheel must be ready to take control at any time. The US accident investigation agency NTSB had warned that drivers were relying too heavily on the technology.

The NHTSA also noted in its report that gaps in Tesla's collection of vehicle data make it difficult to determine the actual number of "Autopilot" accidents. For the most part, the car manufacturer only receives accident data when airbags or seatbelt tensioners are triggered.

According to general accident statistics from 2021, this happens in only 18 percent of all collisions reported to the police. In addition, data transmission to Tesla requires that a mobile network be available and that the antenna still works after the accident. In many cases, electric cars burn out after accidents because their batteries catch fire.

Criticism of the term “autopilot”

The NHTSA also criticized the system's name. The term "Autopilot" could lead drivers to overestimate the capabilities of the software and rely on it too heavily. US drivers can currently use an advanced "Autopilot" version called "Full Self-Driving" as a test version.

However, even FSD does not officially make the car an autonomous vehicle and requires constant human attention. Tesla recently added the word "supervised" in parentheses to the name. Company boss Elon Musk once again promised self-driving Tesla cars this week: he plans to present a robotaxi at the beginning of August.

The standard "Autopilot" system can maintain speed, keep distance to the vehicle ahead, and hold the lane. The FSD version is also supposed to handle traffic lights, stop signs, and right-of-way rules at intersections, among other things. According to the report, US Senators Edward Markey and Richard Blumenthal called on the NHTSA to limit the use of "Autopilot" to the roads for which the system was designed.

Source: Stern
