Correction: An earlier version of this article carried a headline stating Google’s cars were involved in four accidents. They were in fact involved in three. The fourth accident involved a car using Delphi technology.
An investigation by the Associated Press has uncovered that four of the 48 self-driving cars on California’s roads have been involved in accidents since September; the state had refused to make the official accident reports public.
Two of the accidents happened while the vehicle was driving itself, and the other two while the human safety driver was in control. Google, whose cars suffered three of the accidents, and Delphi, whose car was in the fourth, both claim that in all cases human error or inattention, rather than the autonomous technology, was to blame. Chris Urmson, director of Google’s self-driving car program, revealed today that the tech giant’s fleet has experienced 11 minor accidents (“light damage, no injuries”) in the 1.7 million miles and six years since it started testing its driverless cars.
As the AP reported:
The national rate for reported “property-damage-only crashes” is about 0.3 per 100,000 miles driven, according to data from the National Highway Traffic Safety Administration.
In that context, Google’s three [crashes] in about 140,000 miles may seem high. As the company pointed out, however, perhaps 5 million minor accidents are not reported to authorities each year, so it is hard to gauge how typical Google’s experience is.
Each of the six companies testing autonomous cars in California has had to post a $5 million bond against the possibility of their vehicles damaging property or injuring or killing someone. They are also required to report all accidents originating from the operation of their vehicles, even the slightest fender bender.
Some companies aren’t happy about this. As revealed by Quartz last year, Google and Volkswagen Group of America proposed that only accidents occurring while vehicles were driving themselves should be reported. Volkswagen also wanted to establish a reporting threshold based on the value of damages, in order to rule out minor bumps and scrapes. The California Department of Motor Vehicles (DMV) rejected both suggestions, saying that it wants to hear about all crashes involving autonomous vehicles. That rules out the possibility of a car simply handing control back to its human driver in the instant before a collision to avoid a report.
California also wants to understand whether autonomous vehicles are more accident-prone through no fault of their own. Self-driving vehicles might, for example, experience more accidents because other drivers are nervous around them or are unable to read social cues, such as eye contact at stop signs or pedestrian crossings.
But as useful as the DMV expects these reports to be internally, it is currently refusing to release them to the public. In response to a Public Records Act request by Quartz for the reports, the DMV cited a different part of the state’s Vehicle Code, which requires accident reports concerning traffic injuries or fatalities to remain confidential.
“It’s a tenuous stretch of that provision to say that it also covers autonomous vehicle reports,” says Bryant Walker Smith, a professor at the University of South Carolina and a lawyer with the Center for Internet and Society at Stanford Law School. “I suspect this is a policy decision made within the DMV or perhaps by others within government that they do not want to expose companies to this kind of public scrutiny, or they want to encourage candor that might otherwise be stymied.”
“The public has a right to understand what some of the risks are,” says Ryan Calo, a professor at the University of Washington who teaches a class on Robotic Law and Policy. “While I can see the reasons for holding back information about private parties, the whole point of a public records request is to make sure that citizens gain access to information that could affect their lives.”
Much of the justification for developing self-driving cars has been their potential to reduce the million-plus fatalities each year on roads around the world. A battery of laser, radar, and video sensors, paired with algorithms that never get tired, promises a future in which automobile accidents are as rare as commercial plane crashes are today. “We’ve seen this moment when the car basically becomes a better driver than a human being, and it’s a transformational moment,” says Sebastian Thrun, who launched Google’s driverless car program.
But no one knows if such an accident-free future is even possible, or whether autonomous vehicles will prove as flexible and responsive as humans in the most dangerous and unpredictable scenarios. Either way, hiding minor accident reports from the public suggests a rocky road ahead for self-driving cars.