Tesla fatal crash: ‘autopilot’ mode sped up car before driver killed, report finds

Findings about crash in Silicon Valley raise fresh concerns about limits of Elon Musk’s technology

Sam Levin in San Francisco
  
  

The crash is not the first time that safety issues have been raised about Tesla’s ‘autopilot’ mode. Photograph: Francois Mori/AP

A Tesla driving in “autopilot” mode crashed in March when the vehicle sped up and steered into a concrete barrier, according to a new report on the fatal collision, raising fresh concerns about Elon Musk’s technology.

The National Transportation Safety Board (NTSB) said that four seconds before the 23 March crash on a highway in Silicon Valley, which killed Walter Huang, 38, the car stopped following the path of a vehicle in front of it. Three seconds before the impact, it sped up from 62mph to 70.8mph, and the car did not brake or steer away, the NTSB said.

The report – which said the Tesla battery was breached, causing the car to be engulfed in flames – comes after the company has repeatedly sought to deflect blame on to the driver and the local highway conditions. Musk has also aggressively attacked journalists writing about this crash and other recent autopilot collisions, complaining that the negative attention would discourage people from using his technology.

The NTSB report, however, has once again raised serious safety questions about the limits and performance of the autopilot technology, which is meant to assist drivers and has faced growing scrutiny from experts and regulators. Mark Fong, an attorney for Huang’s family, also said the report appeared to “contradict Tesla’s characterization” of the collision.

Following numerous embarrassing autopilot crashes, including Teslas colliding with a police vehicle and a firetruck, the company has pointed to its manual, which warns that the technology cannot detect all objects and that drivers should remain attentive.

After the fatal crash in the city of Mountain View, Tesla noted that the driver had received multiple warnings to put his hands on the wheel and said he did not intervene during the five seconds before the car hit the divider.

But the NTSB report revealed that these alerts were made more than 15 minutes before the crash. In the 60 seconds prior to the collision, the driver also had his hands on the wheel on three separate occasions, though not in the final six seconds, according to the agency. As the car headed toward the barrier, there was no “precrash braking” or “evasive steering movement”, the report added.

Fong said in a statement: “The NTSB report provides facts that support our concerns that there was a failure of both the Tesla Autopilot and the automatic braking systems of the car.”

“The Autopilot system should never have caused this to happen,” he added.

“There’s clearly a technology failure,” said Erick Guerra, an assistant professor in city and regional planning at the University of Pennsylvania. “The technology is just not up to doing as much as people hope it can do.”

Tesla declined to comment on the NTSB report. In previous statements, the company emphasized that the highway safety barrier had been damaged in an earlier crash, contributing to the severity of the collision. The NTSB confirmed this previous damage.

Some critics have argued that the Tesla “autopilot” branding and Musk’s hype about his technology can be misleading and problematic, given that the cars’ current capabilities remain fairly limited. Experts say the development of autonomous technology is entering a particularly dangerous phase, in which drivers are lulled into a false sense of security yet expected to intervene to avoid crashes and other problems.

“In an ideal automated system, it should really be taking over when the driver fails … rather than forcing the driver to take over when it fails,” said Guerra, who was not involved with the NTSB report.

The NTSB, which has publicly feuded with Tesla over the release of information during the investigation, said it intended to issue safety recommendations to prevent similar crashes.

The problems with the damaged highway divider do not “absolve Tesla of responsibility”, said Ryan Calo, a University of Washington law professor and expert in autonomous cars. “That doesn’t mean they are off the hook.”

Tesla’s designers may not have anticipated this specific kind of crash, he added: “The technology is being deployed before there is a clear sense of … what is adequately safe.”

 
