Tesla Motors announced in a blog post today that it had been made aware that one of its Model S electric cars was involved in a fatal accident on May 7, while the car’s “autopilot” function was enabled.
This function, introduced last October as an over-the-air upgrade available to Model S cars built after September 2014, is intended to allow the car to steer, change lanes, and even park on its own.
At the time, CEO Elon Musk told press at the launch event: “We explicitly describe [this software update] as a beta. It’s just important to exercise great caution at this early stage. In the long term, people will not need hands on the wheel—and eventually there won’t be wheels and pedals.”
This is the first death in over 130 million miles traveled with autopilot enabled, Tesla said, comparing that to the average rate for US road deaths of one fatality per 94 million miles traveled. Tesla said the US National Highway Traffic Safety Administration (NHTSA) will be investigating the crash, which took place in northern Florida.
According to Tesla, a tractor-trailer cut across the highway that the Model S was traveling on. The company added:
Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.
Tesla said it designed autopilot so that drivers still have to keep their hands on the wheel for it to work, but many videos uploaded to the internet show Tesla owners testing the software’s ability to operate hands-free, and some even intentionally trying to ram their cars into oncoming traffic. “As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing,” Tesla added. “Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert.”
The US government announced earlier this year its intention to regulate self-driving vehicles and integrate them more fully into US roadways. In February, the NHTSA said it considers autonomous cars to be legal drivers. Currently, only a few states, such as California and Michigan, have regulations for self-driving cars in place, and only for very specific situations. This accident could have wide-ranging implications for automakers trying to develop fully autonomous cars, and for how governments regulate them. It shows that even millions, or hundreds of millions, of miles of experience cannot yet help software predict and react to every possible situation.
Even Google’s self-driving cars, which have been on the road in some capacity since 2009, have driven almost nowhere compared with humans: in seven years, the company’s fleet of 55 vehicles has driven about 1.5 million miles, while just last year, US drivers racked up over 3 trillion miles. Obviously, the vast majority of those human-driven miles are spent in mundane, repeatable, and predictable situations that wouldn’t be of particular use in training a software system to handle a wide range of driving situations. But the more miles that cars drive, the more likely they are to find themselves in new situations.
“Most driving is so easy that we can do it without really thinking about it much,” Gill Pratt, the former head of the DARPA Robotics Challenge and the head of Toyota’s billion-dollar autonomous vehicle, AI, and robotics research lab, told Quartz in January. Pratt said the miles self-driving car programs have racked up are just the tip of the iceberg of the research required before self-driving cars become a reality. “It’s all about reliability—when there’s snow, when there’s a bad sensor, when something that’s really surprising by another driver occurs,” he said. “It’s the rarer cases that are going to be harder to deal with.”