Tesla is recalling the majority of vehicles it has sold in the U.S. to fix a defect in the system that monitors driver attentiveness while Autopilot is engaged.

Tesla is recalling more than 2 million vehicles sold in the U.S. and will issue a software update to fix a defective system that is meant to ensure drivers are paying attention while using Autopilot.

According to documents released by U.S. safety regulators on Wednesday, the update will enhance notifications and alarms for drivers and restrict the geographical regions where the basic versions of Autopilot can function.

The recall follows a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes, several of them fatal, that occurred while the Autopilot partially automated driving system was in use.

The investigation conducted by the agency revealed that Autopilot’s method of ensuring driver attention may be insufficient and may result in predictable misuse of the system.

According to the documents, the added controls and alerts will encourage drivers to keep fulfilling their responsibility to drive whenever the system is engaged.

Safety professionals stated that although the recall is a positive measure, it does not address the root issue of Tesla’s automated systems struggling to detect and avoid obstacles while driving, ultimately placing the responsibility on the driver.

The recall covers certain Model Y, S, 3 and X vehicles produced between Oct. 5, 2012, and Dec. 7, 2023. The software update was to be sent to some affected vehicles on Tuesday, with the rest receiving it at a later date.

Tesla’s shares initially dropped by over 3% during Wednesday’s trading, but rebounded as the overall stock market saw a surge, ultimately finishing the day with a 1% increase.

Dillon Angulo, who suffered serious injuries in a 2019 crash involving a Tesla using Autopilot on a rural Florida highway where it should not have been activated, felt that the company’s efforts to fix the issues with the technology came too late.

Angulo, who is currently recovering from injuries including brain trauma and broken bones, stated that this technology is unsafe and needs to be removed from roads. He is suing Tesla and believes that the government needs to take action instead of allowing such experimentation.

The Autopilot system consists of two functions: Autosteer and Traffic Aware Cruise Control. Autosteer is designed for use on limited-access highways, while a more advanced feature, Autosteer on City Streets, operates in other settings.

According to the recall documents, the software update will limit where Autosteer can be used. If a driver attempts to engage Autosteer when conditions are not suitable, the system will warn them with visual and audible alerts and will not engage.

Depending on a Tesla's hardware, the update makes visual alerts more prominent, simplifies turning Autosteer on and off, and adds checks on whether Autosteer is being used outside controlled-access highways and when approaching traffic controls. According to the documents, a driver can be suspended from using Autosteer if they repeatedly fail to demonstrate responsible, attentive driving behavior.

Per the recall records, NHTSA contacted Tesla in October about its tentative conclusions regarding a fix for the monitoring system. Tesla did not concur with the agency's analysis but agreed to the recall on Dec. 5 in order to resolve the investigation.

Auto safety advocates have for years been calling for stronger regulation of the driver monitoring system, which mainly detects whether a driver's hands are on the steering wheel. They have also urged the use of cameras to make sure drivers are paying attention, a feature other automakers with similar systems already employ.

Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, said Tesla's software update is a step in the right direction but does not address the lack of night-vision cameras to watch drivers' eyes, nor the fact that Teslas still struggle to detect and avoid obstacles.

Koopman said he was disappointed by the compromise because it does not fix the problem that older Teslas lack adequate hardware for driver monitoring.

Koopman and Michael Brooks, who leads the nonprofit Center for Auto Safety, said the recall fails to address the safety defect of Teslas on Autopilot colliding with emergency vehicles. Brooks argued that the fix does not answer the central question of the investigation: why Teslas on Autopilot fail to detect and respond to emergency activity.

Koopman said NHTSA apparently decided that the software change was the most it could get from the company, and that the benefits of implementing it now outweighed the costs of spending another year negotiating with Tesla.

On Wednesday, NHTSA declared that the inquiry is still ongoing “as we observe the effectiveness of Tesla’s solutions and continue collaborating with the company to ensure maximum safety.”

Autopilot can steer, accelerate and brake automatically within its lane, but despite its name it is not a self-driving system. Tests have shown that the monitoring system is easy to fool; drivers have been caught using Autopilot while drunk or even while sitting in the back seat.

According to Tesla’s defect report submitted to the safety agency, the controls of Autopilot may not adequately prevent driver misuse.

A message seeking further comment was left with the Austin, Texas-based company on Wednesday morning.

According to Tesla's website, Autopilot and the more advanced Full Self Driving system are designed to assist drivers, who must remain attentive and prepared to take over at any moment. Full Self Driving is currently being tested on public roads by Tesla vehicle owners.

In a post Monday on X, the platform previously known as Twitter, Tesla said that safety is stronger when Autopilot is engaged.

Since 2016, NHTSA has dispatched investigators to 35 crashes involving Teslas suspected of operating on an automated system. At least 17 people have been killed in those incidents.

NHTSA is investigating several cases in which Teslas using Autopilot collided with emergency vehicles. The cases are part of the agency's broader probe into Tesla safety concerns, which recently led to a recall of the Full Self Driving software.

In May, Secretary of Transportation Pete Buttigieg, whose department oversees NHTSA, stated that Tesla should not refer to their system as Autopilot as it is not capable of fully autonomous driving.

---

Associated Press technology writer Michael Liedtke contributed to this story.

Source: wral.com