Tesla Faces Upgraded US Probe Into Autopilot Over Emergency-Scene Crashes

US auto safety regulators have stepped up their investigation into crashes at emergency scenes involving Tesla's Autopilot, a critical step in deciding whether to request a safety recall.

The National Highway Traffic Safety Administration said in a notice published Thursday that it is expanding an investigation it began last August into a series of collisions in which Tesla vehicles using Autopilot struck first-responder vehicles on roadways.

The agency said it is upgrading its previous investigation to an engineering analysis after identifying new incidents involving Autopilot and emergency-response vehicles.

The NHTSA also said it has expanded its examination of Autopilot to include a broader range of accidents, not just those that occur at emergency scenes. The agency said it will further assess how drivers interact with Autopilot and the degree to which the system may reduce motorists' attention.

The agency said that available forensic data on 11 accidents showed that drivers failed to take evasive measures within two to five seconds before impact.

The investigation covers an estimated 830,000 Tesla vehicles made from 2014 to 2021, including the Model 3, Model S, Model X and Model Y.

The NHTSA said in its filing that it has identified 15 injuries and one death related to the crashes.

Tesla did not immediately respond to a request for comment. The electric-car maker's stock rose 2.5% at midday Thursday, after news of a strong rebound in production at its factory in China.

Autopilot, Tesla’s name for the advanced driver assistance technology used in its cars, is designed to assist drivers with tasks such as steering and maintaining a safe distance from other vehicles. Tesla directs drivers using the system to pay attention to the road and keep their hands on the steering wheel.

The electric-car maker has long maintained that driving with Autopilot engaged is safer than driving without it, citing internal data showing crashes were less common when drivers used the system. Some researchers have criticized Tesla's methodology.

In opening the initial investigation last year, the NHTSA said it had identified 11 incidents since early 2018 in which a Tesla using Autopilot collided with one or more vehicles involved in an emergency-response situation. In its latest filing, the agency said it had identified six additional accidents involving Tesla vehicles and first-responder vehicles in which Autopilot was in use.


The expanded investigation of Autopilot is the latest sign that US auto safety regulators are growing bolder in scrutinizing advanced vehicle technologies that automate some or all driving tasks.

The NHTSA is preparing to release new crash data this month, The Wall Street Journal has reported, giving the public a first detailed look at the frequency and severity of accidents involving so-called automated-driving and advanced driver-assistance features.

More than 100 companies are subject to an agency order requiring them to report crashes in which these systems were in use. Among them are operators of self-driving car fleets, such as Alphabet Inc.'s Waymo and General Motors Co.'s Cruise LLC.

The technology under review includes lane-keeping assistance and cruise-control systems that maintain a constant distance behind a leading vehicle, as well as more advanced systems with features that can steer the vehicle along highways with minimal driver input.

Autopilot has become a particular focus of US regulators in recent years, spurred by incidents in which drivers have misused the technology, for example by defeating its safety functions to operate the vehicle without their hands on the steering wheel. Some critics have also said that the term Autopilot risks giving drivers an inflated sense of the system's capabilities.

The NHTSA said in its most recent filing that driver use or misuse of Autopilot does not necessarily prevent the agency from determining that the technology is defective.

“This is particularly the case if the behavior of the driver in question is expected given the design or operation of the system,” NHTSA said. Automakers are legally required to initiate a recall if a safety defect is discovered in their vehicles.

Separately, the NHTSA has opened a broader investigation into dozens of accidents in which it suspects advanced driver-assistance features may have played a role. While that investigation covers vehicles made by any automaker, accidents involving Teslas account for the most cases under investigation, including several fatalities.

Copyright © 2022 Dow Jones & Company, Inc. All Rights Reserved.
