Why does Tesla's Autopilot keep crashing into ambulances? US authorities investigate

Numerous technical aids help Tesla drivers behind the wheel, and the cars can even drive largely by themselves. But the Autopilot system seems to have trouble recognizing emergency vehicles. US authorities now want to know why.

Keeping speed and lane, spotting obstacles, even parking: modern cars handle more and more on their own. Hardly any manufacturer goes as far as Tesla: the cars of the US electric pioneer can drive largely automatically. But one bizarre problem keeps causing accidents, including injuries and even a death: Tesla's Autopilot repeatedly drives into stationary emergency vehicles. Now the authorities are investigating.

The US National Highway Traffic Safety Administration (NHTSA) has opened an investigation into the issue, it announced on Friday. The background is serious: since 2018, the agency has registered eleven crashes in which a Tesla, operating with one of its various Autopilot functions, struck a stationary emergency vehicle. Four of the incidents occurred between late February and mid-July of this year alone. In total, 17 people were injured in the crashes and one person was killed.

Blind to emergency vehicles

For now, the agency is primarily trying to establish more details about the crashes, a spokesman told "The Verge". According to the announcement, the investigation covers all Teslas sold since 2014: the Model S, 3, X and Y are all affected. Most of the crashes occurred in poor lighting conditions, and the software reportedly ignored cues such as warning lights, flares, traffic cones and even illuminated signs.

The problem has in fact been known for a long time. In a 2018 Wired report, experts pointed to one possible cause: such systems tend to ignore stationary objects so they are not constantly triggered by things at the roadside, such as trees or parked cars. According to the report, both Tesla and Volvo explicitly warn about this behavior. Tesla's owner's manual notes that Autopilot may fail to recognize a stationary obstacle if the car ahead suddenly swerves to avoid it; the car may then be unable to slow down or brake in time, Tesla warns.

No real autonomy

As far as the traffic authority is concerned, the feature's name is misleading in any case. Speaking to "The Verge", the spokesman was keen to emphasize that not a single car available to customers today is capable of driving itself. All of them still require human supervision and merely assist with driving. Tesla has repeatedly been accused of instilling too much confidence in Autopilot among drivers through the way it promotes the feature.

It is therefore not surprising that part of the investigation is meant to clarify which "technologies and methods are used to monitor, assist and enforce the driver's engagement" while Autopilot is in use, according to the announcement. In plain terms: the authority wants to find out whether a Tesla also makes sure the driver is paying the necessary attention.

The timing of the investigation may also be connected to Tesla's recent decision to roll out a trial version of its software. Since the end of July, a beta of the system, unambiguously named "Full Self-Driving", has been available for installation. It drew criticism very quickly; one verdict was that it drives like "a drunk driver". "Videos of FSD Beta 9 in action don't show a system that makes driving safer or less stressful," said Jake Fisher, auto testing expert at Consumer Reports.
