
Tesla Autopilot allegedly failed the test with a bang


In the US, around 100,000 Teslas are driving with a beta version of the so-called "Autopilot". The name suggests that the car will reach its destination safely without any human help. A new test aims to show that children, at least, should be wary around these vehicles.

Experts have been at odds with Tesla for years: many consider the software for fully autonomous driving, which Tesla calls "Autopilot" or "FSD", to be immature and the name therefore dangerous. Especially in the US, where, unlike in the EU, current software is tested as a beta version on public roads, many Tesla drivers literally sit back and let the car do the work. Software developer Dan O'Dowd wants to show that trusting the software can have dire consequences.

TV ad warning about Tesla

Billionaire O'Dowd spent a lot of money on his test, which was carried out on behalf of the "Dawn Project", an initiative for safe software: the results are to be broadcast on American television. The video slated for broadcast urgently warns of the dangers of Tesla's software.

The video's narrator cites quotes from Tesla CEO Elon Musk, who has repeatedly advertised "Autopilot" as an outstanding technology, and then asks whether the software actually works. The video goes on to show a Tesla Model 3 driving unbraked into a child-sized dummy in three out of three attempts – mowing it down without any intervention.

The warning follows: 100,000 people are on American roads testing this software – and that poses a great risk. Dan O'Dowd, introduced in the video as "President and CEO of Green Hills Software", calls the beta software "the worst software he's ever seen" – and urges viewers to push the US Congress to revoke Tesla's "Autopilot" approval.

Tester is considered biased

The video is making waves on social media – and it is not uncontroversial. Green Hills Software, the company whose profits flow into O'Dowd's pocket, also produces software for autonomous cars, for example for the current BMW iX models. Nor is the campaign O'Dowd is waging against Tesla new: the software developer warns about Autopilot every few months, for example with a full-page advertisement in the "New York Times" in January or with regular tweets. As early as April, Politico wrote that Dan O'Dowd wanted to "destroy" Musk.

The entire test should therefore be viewed with caution, as there is clearly a conflict of interest. Nevertheless, the results cannot simply be dismissed out of hand. The "Dawn Project" describes in detail on its own website which methodology was used and what the results were. The conclusion: the vehicle ran over the doll several times at full force without warning.

Another video shows the same result

Another Twitter user named Taylor Ogan, apparently an investor with shares in various electric car manufacturers, has posted a similar video. In it, a white Tesla competes against a vehicle from another manufacturer. But in contrast to the competing product, the Tesla apparently plows over the doll without hesitation. Ogan cites the missing laser-scanning system, lidar, as the reason – a technology Musk has always vehemently rejected and described as useless.

For the background of this video, too, it is important to know that the test shown was apparently carried out by the company Luminar, whose business is the sale of lidar systems.

“A Tesla recognizes cardboard”

Counter-arguments from Tesla fans and defenders are numerous. They range from the rather unfounded claim that a Tesla "can tell cardboard from a real child" to videos with a similar test setup that show completely different results. The Twitter user "tesladriver2022" writes: "I just destroyed Dan O'Dowd's advertisement with the cardboard friend. Not only did my Tesla correctly recognize the child, it reliably swerved around it every time."

However, this test setup has also been criticized because the parameters are unequal: O'Dowd deliberately narrowed the roadway, while the Tesla in the counter-demonstration was given significantly more room.

That said, even independent tests such as the Euro NCAP assessments show no major deficiencies in a Tesla's behavior: the car reliably avoids obstacles and people – including dummies.

The devil is in the updates

Still, "Autopilot" remains a problem. Firstly, the name suggests complete autonomy of the car, which Tesla delivers neither in the EU nor in the US; secondly, every test is based on a different software version – and Tesla turns drivers on public roads into test subjects.

More than once, this strategy has forced Tesla to withdraw software after release, when it turned out that errors had crept in that could lead to problems or dangerous driving maneuvers.

Asked whether the system is a way to drive autonomously, Tesla itself writes: "Not yet. All Tesla vehicles require active driver monitoring and are not autonomous. With the FSD computer, we expect to accumulate billions of miles of experience with our features in order to achieve new levels of autonomy. Deploying and using autonomy features requires proven reliability that far exceeds the capabilities of human drivers."

And that is precisely the manufacturer's dilemma: how are you supposed to cover billions of miles in a reasonable amount of time if you are not allowed to involve your customers? Theoretically, you could ask other manufacturers with similar goals – those who rarely make the press for running over cardboard children.


Source: Stern
