NTSB Calls Out Tesla, Apple And NHTSA Over Fatal Autopilot Crashes And Sloppy Regulating

The National Transportation Safety Board (NTSB) just put Tesla, Apple, and the federal government on blast, claiming the two companies have been reckless in self-regulating vehicle safety systems and distracted driving, and concluded that the government needs to step in and do more to regulate the driver assistance systems on our roads.

Let’s start with Tesla. In a hearing concerning the fatal March 2018 crash of Wei “Walter” Huang, whose Tesla Model X was operating on Autopilot at the time of the incident, the NTSB took the company to task over its Autopilot semi-autonomous driver assistance system.

NTSB Chairman Robert Sumwalt claimed the automaker has ignored safety recommendations the Board sent to Tesla and five other companies working on self-driving technology back in 2017, prior to Huang’s fatal crash. The other five companies all responded acceptably within 90 days; Tesla hasn’t officially responded to the recommendations in over two years.

The Board’s investigation into Huang’s crash also revealed that, during that trip, the vehicle’s system had already issued Huang one audible warning to place his hands back on the steering wheel and pay attention to the road.

The vehicle logs also show there were no hands on the steering wheel in the six seconds leading up to the vehicle’s impact with a damaged safety barrier on the highway shoulder, and no attempt by the driver to steer or brake, even though Autopilot was still active.

Four seconds before impact, the vehicle began accelerating because the system believed it was in an open lane with no traffic ahead. The crash site indicated visibility was good enough for an alert driver to have had time to react.

The investigation concluded that Autopilot steered the vehicle out of its lane after mistaking the gore area of the roadway for a travel lane. During this maneuver, there were no warnings to the driver.

Here’s how Tesla’s website currently defines its Autopilot system, which is widely considered to be a level two system:

Autopilot enables your car to steer, accelerate and brake automatically within its lane. Current Autopilot features require active driver supervision and do not make the vehicle autonomous.

Tesla also allegedly told NTSB investigators that it is the driver’s responsibility to determine an acceptable operating environment for Autopilot.

The NTSB disagrees, arguing the company could do more to protect occupants and citing the Society of Automotive Engineers (SAE) definition of a level two self-driving system, which specifically notes that “manufacturers can implement automatic safeguards to ensure their systems are used only in conditions for which they are designed.”

The NTSB also argued that long-standing research shows human drivers are poor monitors of automated systems. The Board also determined that so-called “predictable abuse” of a system, like drivers knowingly activating Autopilot and then playing video games, can qualify as a system defect under NHTSA’s enforcement guidelines, which could be a big deal going forward.

As harsh as the Board was on Tesla, it also determined that distracted driving was a significant factor in Huang’s death. But, unlike Tesla’s efforts to push blame onto how drivers choose to use its Autopilot system, the NTSB blamed Huang’s distraction on Apple, his employer and the manufacturer of the phone he was allegedly using while driving.

The investigation revealed that a mobile game application had been running on Huang’s phone for minutes prior to the crash. Here are more details from The Verge’s report on the hearing:

Sumwalt took Apple to task for not having a distracted driving policy that could have helped prevent this specific case of Autopilot misuse. He said that while many people consider Apple to be a technology leader, he considers the company to be “lagging behind” in this area.

In response, Apple said it expects its employees to follow the law and pointed The Verge to two support pages for the company’s “Do Not Disturb While Driving” feature.

Sumwalt advocated for wide adoption of lockout features like Do Not Disturb While Driving during his opening remarks. “It is not anti-technology to ask that mobile devices prevent drivers from engaging in distracting activity through the simple installation of a lockout mechanism or application,” he said. “It is pro-safety.”

But the NTSB’s biggest target in this hearing was the federal government itself. The Board slammed the Department of Transportation (DOT) and the National Highway Traffic Safety Administration (NHTSA) for their collective “non-regulatory hands-off approach to oversight of level two” automated systems.

Sumwalt brought the lack of oversight into focus, via The Verge:

“What struck me most about the circumstances of this crash was the lack of system safeguards to prevent foreseeable misuses of technology,” Sumwalt said in his prepared remarks on Tuesday. “Instead, industry keeps implementing technology in such a way that people can get injured or killed, ignoring this Board’s recommendations intended to help them prevent such tragedies.”

The Board also noted that investigations into crashes potentially implicating current automated driver assistance systems are made more difficult because they require manufacturer cooperation to obtain vehicle systems data. Data from automated assistance systems is not covered by current Event Data Recorder requirements, so investigators have to wait for the company to retrieve and send the information.

The Board’s Dr. Robert Molloy had strong words for the current regulatory environment concerning automated driving systems:

For my staff and myself, there’s nothing more disappointing than investigating a crash, coming up with a good solution, having no response from Tesla, and in NHTSA’s case, ‘We don’t need to do that.’

You can now watch the hearing in full on the NTSB website.


We may finally be on the verge of a long-overdue overhaul for how we approach and regulate automated driving systems.

The NTSB’s investigation and report suggest stricter guidelines for automakers, a defect investigation program within NHTSA’s broader oversight of these systems, and more responsibility on tech companies to ensure handheld devices are designed to prevent distracted driving.

It seems the Wild West of automated driving may be coming to an end. Here’s hoping a stronger focus on how these systems are labeled and marketed, and on irresponsible driver use of the technology, will lead to less distracted driving, fewer fatal crashes, and a sense of responsibility shared between drivers and companies. But it’s probably going to be a fight.