US: Tesla does not take sufficient measures to prevent misuse of Autopilot

The operation of Tesla’s Autopilot software has led to “foreseeable misuse and avoidable accidents.” This is the conclusion of an American study into this system. It is not clear whether last December’s Autopilot update fixed the problems.

The US traffic regulator, the National Highway Traffic Safety Administration (NHTSA), has shared the results of its investigation into the operation of Tesla's Autopilot function. For the investigation, 956 accidents spanning a three-year period were analyzed. The analysis showed that the Autopilot function played a role in 467 of those accidents. Thirteen of them were fatal.

According to the NHTSA, those accidents were caused by drivers misusing the system. However, the regulator blames Tesla for not implementing enough safety measures to prevent such misuse. The regulator says that "under certain circumstances" the Autopilot system does not sufficiently ensure that drivers pay attention and use the function correctly.

Drivers expect the Autopilot system to require much less supervision than is actually the case, leading to a "critical safety gap," the NHTSA said. According to the regulator, Tesla must improve the effectiveness of Autopilot's warnings and ensure that users better understand what the system can and cannot be used for.

The regulator is concerned, among other things, about the name "Autopilot". That name suggests the mode allows the car to drive autonomously, whereas a name containing "assist" or "team" would describe it better, the government body claims. In addition, attempts to steer manually while Autopilot is active cause the system to deactivate completely, which can discourage drivers from staying engaged in the driving task, the NHTSA writes.

Last December, Tesla released a software update for part of Autopilot, namely the self-steering Autosteer function. The company did this after the NHTSA informed Tesla that the manufacturer did not sufficiently check whether drivers were holding the steering wheel. The accidents analyzed in the current study all occurred before this update was released, so it is not clear whether the update addresses the regulator's concerns. The NHTSA has started a new investigation to find out.

However, the government body has already indicated that the update is probably not sufficient to solve the problems, as several new reports of accidents involving Autopilot have emerged since its release. Moreover, installing the update is optional for drivers, and it can also be rolled back.
