
Safety experts: Tesla’s new “Full Self-Driving” beta software drives “like a drunk driver”

In the USA, Tesla is rolling out a beta version of its autonomous driving software. Consumer Reports sharply criticizes this field trial on public roads.

Tesla’s handling of Autopilot has repeatedly drawn criticism. Tesla promises and suggests more to consumers than the software can actually deliver. That is dangerous, even if attentive customers are, to a certain extent, informed about the system’s limits in the fine print.

Tesla’s “Full Self-Driving” beta software has been in use on public roads for about a week. Now the safety experts of the influential Consumer Reports (“CR”) in the USA have spoken out, sharply criticizing the use of a beta version in public traffic without additional safety precautions.

FSD Beta 9 is a preliminary version of a “full self-driving” mode. The problem starts with the name, because the system does not provide full autonomy. The software update automates many driving situations: with the new software the vehicle can navigate intersections and city streets, but only under the supervision of the driver. Yet: “Videos of FSD Beta 9 in action do not show a system that makes driving safer or less stressful,” says Jake Fisher, head of the Auto Test Center at CR. “Consumers are simply acting as test engineers in developing a technology without adequate safety protections.”

Field trial with bystanders

Fisher criticizes Tesla for not at least using existing technology to monitor driver attention. “It’s not enough for Tesla to ask people to pay attention – the system has to make sure people are engaged and focused when the system is up and running.” He recommends using in-vehicle driver monitoring systems to ensure that drivers keep their eyes on the road. “We already know that tests to develop self-driving systems can – and will – be fatal without adequate driver support.”

The software does appear more capable, but that makes it more dangerous in everyday use. Fisher: “If software works well most of the time, a small mistake can be catastrophic because drivers trust the system more and intervene less when it is needed.”

Far from real autonomy

Other experts are appalled by the performance of the beta software. Selika Josiah Talbott, a professor at the American University School of Public Affairs in Washington, DC, said she saw videos showing the Tesla behaving “almost like a drunk driver” and struggling to stay between the lane lines. “It meanders to the left, it meanders to the right,” she says. “While its right turns seem pretty solid, the left turns are pretty wild.”

“It’s hard to tell from the outside exactly what the problem is when you watch the videos. But it is clear that the software has a problem with object detection and/or classification,” said Missy Cummings, automation expert and director of the Humans and Autonomy Laboratory at Duke University in Durham, NC. She does not rule out the possibility that Tesla will build self-driving cars at some point in the future. “But are they ready now? No. Are they even close? No.”

“It’s a very Silicon Valley-esque ethos to get your software 80 percent complete, then release it and let users figure out the problems,” Cummings continued. “And maybe that’s okay for a cell phone, but not for a safety-critical system.”

Government rules needed

Jason Levine, executive director of the Center for Auto Safety, fears that Tesla’s course could damage the entire industry: “Vehicle manufacturers who choose to test their unproven technology on vehicle owners and the public without consent will, at best, set back safety and, at worst, lead to avoidable accidents and deaths.”

CR criticizes the whole approach of trying out a beta version in public. Bryan Reimer, a professor at MIT, told CR that “drivers are aware of the increased risk they are taking, while other road users – motorists, pedestrians, cyclists, etc. – are unaware that they are in the vicinity of a test vehicle. They did not agree to take such a risk.”

Tesla has been criticized for some time for its cavalier, cost-cutting approach to autonomous driving. Founder Elon Musk aggressively defends the decision to forego additional lidar sensors in Tesla models. Other companies that develop self-driving cars, such as Argo AI, Cruise, and Waymo, told CR that they limit software testing to private routes and use trained drivers as observers.

CR calls for strict, sensible safety rules to be introduced in the USA and for manufacturers to be held liable for their mistakes. William Wallace, manager for safety policy at CR: “Otherwise some companies will simply treat our public roads like a private test track without being held accountable for safety.”
