NHTSA releases Tesla Autopilot data as review continues

NHTSA divided the crash data into two categories based on the level of automation: driver assistance systems, which provide speed and steering input, and fully autonomous technologies, which are intended to operate safely without human intervention. NHTSA found that there were 367 crashes over the past nine months involving vehicles using driver assistance technologies. 273 of those incidents involved a Tesla system, either its "Full Self-Driving" software or its precursor, Tesla Autopilot.

There were 130 accidents involving entirely automated driving systems, 62 of which were Waymo accidents. Transdev, a shuttle operator, reported 34 accidents, and Cruise, which provides robotaxis for General Motors in San Francisco, reported 23.

The data lacks critical context like fleet size or number of miles traveled, making it impossible to fairly compare the safety of different technologies. The data set may not include all relevant crashes, NHTSA said, because crash-reporting practices can vary widely between manufacturers.

“I would advise caution before trying to draw conclusions based solely on the data we publish. In fact, the data alone may raise more questions than it answers,” NHTSA administrator Steven Cliff told reporters during a Tuesday briefing.

Two of the technologies with the most reported crashes are also two of the most widely used systems. Tesla Autopilot, for example, comes standard on all of the company's vehicles, unlike competing driver assistance systems from other automakers. Drivers describe using Autopilot regularly, saying it can help them feel less tired after long journeys. Waymo, which reported the most crashes among fully automated systems, operates the most extensive robotaxi service in the country, with operations in much of metropolitan Phoenix, Arizona, and San Francisco.

For the first time, automakers and robotaxi operators were required to report crash data involving these vehicles to NHTSA. NHTSA says it will use the data to identify safety issues and intervene if necessary. Pony.ai, which tests robotaxis in California, recalled three of its vehicles this year following data NHTSA collected through that process.

Of the total 497 accidents, 43% occurred in California. The state is home to Silicon Valley, making it a hotspot for testing new technologies.

NHTSA found that of the 367 reported driver-assist crashes, there were six fatalities and five serious injuries.

The safety risks of these new technologies have captured the attention of safety advocates for years. There are no specific regulations for driver assistance systems, leaving automakers free to market and describe the systems as they wish.

Autopilot and Tesla's "Full Self-Driving" software have been particularly controversial. NHTSA's investigation into Tesla vehicles crashing into parked first responder vehicles was expanded last week and could lead to a recall.

The National Transportation Safety Board has investigated fatal crashes involving Autopilot and called on the automaker to make changes, such as developing technology to more effectively detect the driver's level of engagement and alert them when their engagement lapses.

Tesla has released data since 2018 claiming Autopilot has a lower accident rate per mile than typical driving. But safety experts warn that Tesla's analysis compares apples to oranges, because most Autopilot driving takes place on highways, where accident rates per mile are far lower than in other driving environments.

Tesla says drivers using Autopilot should remain alert and be prepared to take full control of the vehicle at all times. However, drivers using technology such as Autopilot are at risk of being distracted, experts say.

A 2021 MIT study found that Tesla drivers looked away from the road more frequently when using Autopilot than when driving without the driver assistance system.

NHTSA said its investigation of Tesla vehicles rear-ending emergency vehicles while using Autopilot found that in 37 of 43 crashes with detailed vehicle log data available, drivers had their hands on the steering wheel in the last second before the collision.

For years, Tesla sensed torque on the steering wheel to determine whether a driver was engaged. It has since started using an in-cabin camera to detect distraction, which many safety experts say is a superior method because cameras can track eye movements.

Tesla and Waymo did not immediately respond to a request for comment.
