In 2019, a Tesla suddenly veered off a highway, crashed into a tree, and caught fire. The driver died. His fiancée blames the car’s “Autopilot” system.
Tesla has won the first US trial over the role of its “Autopilot” driver-assistance system in a fatal accident. The electric car maker led by tech billionaire Elon Musk convinced a jury in Riverside, California, that “Autopilot” was not responsible for the crash, as Bloomberg and the “Wall Street Journal”, among others, reported from the courtroom.
In the 2019 accident, the Tesla of a 37-year-old driver who was traveling with his fiancée and her son left a highway in Southern California, hit a tree and burst into flames. The driver was killed; the fiancée and her son survived with injuries.
Did “Autopilot” change direction?
In the lawsuit, the survivors alleged that errors in the vehicle’s “Autopilot” system were responsible for the accident. Their lawyer said that “Autopilot” had changed the direction of travel on its own. They also argued that the company knew about weaknesses in the system and gave owners a false sense of security.
Tesla pointed out, among other things, that drivers using “Autopilot” must keep an eye on the traffic situation and be ready to take control at any time. The company also argued that there was no evidence “Autopilot” was activated before the accident. Tesla’s lawyer denied that the system could change direction in this way; a person in the vehicle must have triggered the maneuver.
Tesla vehicles record a variety of data that is often helpful in accident investigations. After crashes involving fire, however, this information is sometimes no longer available.
More trials to follow
Further trials over allegations against the system are scheduled in the US for the coming months. In early 2024, a lawsuit over the fatal 2018 crash of an Apple employee whose Tesla drove into a concrete barrier on a highway in the heart of Silicon Valley is expected to go to trial. That accident occurred in a construction zone where the highway was being widened, and one theory was that remnants of old road markings could have confused the system. US accident investigators concluded that the driver had placed too much trust in the system.
Tesla previously won a US lawsuit over a non-fatal accident: a Tesla drove into a lane divider in 2019, and the driver blamed the system. Among other things, Tesla pointed out that the “Autopilot” version in use had not been approved for city driving. The individual cases differ so widely that the first two trials allow no conclusions about others.
A series of accidents
The “Autopilot” system is controversial. While Musk and Tesla emphasize that it makes the vehicles safer, a number of accidents have drawn the attention of US regulators, including a series of incidents in which Teslas drove into emergency vehicles parked on the side of the road with their hazard lights on. According to media reports, the US Department of Justice and the Securities and Exchange Commission are also investigating whether Tesla has always provided accurate information about the capabilities of the assistance system.
“Autopilot” handles assistance functions such as keeping distance from the vehicle ahead and changing lanes. Tesla is also currently letting customers in the USA test a more advanced system called “Full Self-Driving” (FSD). Despite the name, it does not make a Tesla an autonomous car, but it is designed, for example, to stop at traffic lights and stop signs and to turn at intersections in the city. The driver remains responsible, not Tesla. Some test users have reported serious software errors in traffic.
Source: Stern