Driver error and an overreliance on vehicle automation were the probable cause of the fatal Tesla crash in Florida last year. But the US National Transportation Safety Board explicitly shared some of the blame with Tesla in its conclusion of the investigation on Sept. 12.
The agency specifically faulted Tesla for not restricting the use of Autopilot to appropriate roads such as highways, and for failing to sufficiently monitor driver attention at high speeds. It issued seven new safety recommendations and reiterated two previous ones, including capturing standard reporting data from automated systems, limiting the use of self-driving features to the environments they were designed for, and developing ways to measure drivers’ engagement.
“System safeguards were lacking,” NTSB chairman Robert Sumwalt said today in a statement. “Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention… The result was a collision that, frankly, should have never happened.”
The inquiry was triggered by the 2016 crash of a Model S driven by 40-year-old Joshua Brown, who collided with a tractor trailer in Florida while in “Autopilot” mode. Investigators later found that Brown’s hands were on the steering wheel for only 25 seconds of the 37 minutes the car was in Autopilot, and that the car’s system warned Brown seven times to place his hands back on the wheel before the collision. Neither the driver nor the car’s sensor system detected the tractor trailer pulling across the road ahead.
Tesla said it was reviewing the agency’s recommendations, according to Reuters. It also cited a January report by the National Highway Traffic Safety Administration that found no safety issues with Tesla’s Autopilot system, as well as a 40% decline in crashes after automatic steering was added to Tesla cars. Tesla has since upgraded its sensor suite and software to prevent drivers from using Autopilot if they ignore the system’s warnings, and it is collecting millions of miles of driver data to improve Autopilot’s performance.
The Florida crash highlights the difficulties carmakers face in deciding how to regulate drivers’ behavior and how much autonomy to hand over to the car. The handoff between human and machine remains the most problematic challenge for autonomous (and semi-autonomous) cars. Transferring control back to the driver, and regaining their full attention, takes an average of 3 to 7 seconds, according to data from Audi. Doing that safely and consistently is still an unsolved design problem.
Ford, which is developing its own self-driving vehicles, says it is aiming for Level 4, or fully autonomous, vehicles from the start. “Right now, there’s no good answer, which is why we’re kind of avoiding that (semi-autonomous) space,” Ford’s vice president of research and advanced engineering told Wired.
At the heart of the debate is a simple question: are automated cars safer? Of the more than 33,000 motor vehicle deaths in the US each year, about 94% are attributed to human error. Tesla CEO Elon Musk has long asserted that autonomous vehicles will prevent many of these deaths. “The probability of having an accident is 50% lower if you have Autopilot on,” Musk said at a 2016 energy conference in Oslo, Norway. The NTSB offered qualified agreement today, saying that “automation in highway transportation has the potential to save tens of thousands of lives,” but cautioned that “people still need to safely drive their vehicles” in the meantime.
Brown’s death has not dimmed Musk’s belief that autonomous vehicles are ultimately safer than humans behind the wheel. The company released a new hardware and software suite for the Autopilot feature earlier this year, and Musk claims the new system would “very likely” have prevented the fatal Florida accident. He has said he expects Tesla’s Autopilot to come out of beta once it is approximately 10 times safer than the US vehicle average, a milestone he suggests could arrive as early as next year.