Uber driver Nathan Stachelek had pulled off to the side of the road when he saw the self-driving car turn the wrong way.
It was the night of Sept. 26, and the car he had spotted, one of the autonomous Ford Fusions that Uber is testing in Pittsburgh, Pennsylvania, was heading through the city’s Oakland neighborhood, just steps from the center of campus for the University of Pittsburgh. Stachelek watched the car turn off Bates Street and onto Atwood, a one-way road, going in the wrong direction. From a distance he couldn’t tell whether the car was driving itself or whether its human operator had made a mistake. Stachelek took out his phone in time to shoot a brief video of Uber’s vehicle backing up and driving away, then uploaded it to Facebook.
“Driverless car went down a one way the wrong way,” he wrote. “Driver had to turn car around.”
Uber, the world’s most valuable startup, with a $68 billion valuation, has rushed to be first-to-market with driverless technology. The company showed off its self-driving cars at a media event in Pittsburgh last month before putting four Ford Fusions into service for a small group of local riders. It plans to add 100 Volvo SUVs to the fleet by the end of this year. Uber is betting on truly autonomous vehicles to transform the economics of ride-hailing by eliminating its biggest cost—drivers. The company lost at least $1.2 billion in the first half of 2016, Bloomberg reported in August, with the majority of that spent on driver subsidies.
For now Uber’s cars have limited operating hours and terrain, and they must travel with two humans up front—a designated “safety driver” behind the wheel and an engineer in the adjacent seat. Even so, the company is pushing this technology onto the public while it remains largely unproven, and other tests of driverless cars around the US have yielded their fair share of accidents. Earlier this year, a self-driving Google car hit a public bus while trying to make a right turn in Silicon Valley. In May, the driver of a Tesla Model S died in an accident while he had the autopilot function enabled. Google suffered its worst crash yet just a few weeks ago when another driver ran a red light and barreled into its self-driving Lexus.
Stachelek isn’t the only Pittsburgher to spy one of Uber’s self-driving cars in an awkward spot. Late on the night of Sept. 24, another Uber driver and his two passengers encountered a self-driving Uber and a second car pulled over at the intersection of Bigelow Boulevard and Herron Avenue, about a five-minute drive from the Advanced Technologies Center (ATC), Uber’s research facility for driverless technologies. The second car had its hazard lights on and was being inspected by a man with a lanyard around his neck in the apparent aftermath of an accident.
“I couldn’t see any of the damage,” says Jason, the Uber driver, who requested Quartz withhold his last name because he feared being deactivated by the company. But “there’s no reason for a self-driving Uber car to be pulled over in the way that it was, with another car right behind it with its flashers on.” Amber McCann, a Pittsburgh resident and one of Jason’s passengers that night, told Quartz the intersection is known as a place “where there’s a ton of rear-ending accidents.” Her friend and the car’s other passenger, Jeanette McCulloch, provided Quartz with a photo she took while driving by.
Uber said it was aware that another car had tapped the fender of one of its self-driving Fords on the night of Sept. 24. The company said that was the only incident it had heard of involving one of its self-driving cars in Pittsburgh and that it was reported as the “lowest level”; it didn’t specify whether the car was in autonomous mode at the time. The company also didn’t have any record of a self-driving car turning the wrong way on a one-way street, either while in autonomous mode or because its human driver made a mistake.
While it would be easy to write these incidents off as minor mishaps, both suggest how much work Uber has left to do on its autonomous software, even as it has begun putting real passengers in the cars. One reason Uber’s vehicles currently travel only a small area of Pittsburgh is that those are supposed to be the streets its engineers have carefully mapped and taught the cars about. If that’s really the case, no self-driving car should be turning the wrong way down a one-way street—nor should its safety driver, who is in theory the final check on the car’s autonomy.
Driverless vehicles also tend to operate in a cautious, hyper-logical manner and follow the rules of the road to a tee. Uber, again via its mapping efforts, has tried to prepare its cars to avoid certain tricky situations they might run into. On one street near the ATC in Pittsburgh, Uber engineers have instructed the self-driving cars to hang close to the curb because trucks making turns are more likely to swerve into the oncoming lane. By that same logic, the cars should also know certain intersections are hotspots for rear-ending accidents and be on the alert to avoid them, much as a savvy human driver would be. Uber’s approach differs from that of other companies such as Nvidia, which have focused on teaching computer systems to drive in a more adaptive, human-like way—by being introduced to situations a few times, and then applying what they learn to other encounters on the road.
The safety guidelines for autonomous vehicles (.pdf) recently released by the US Department of Transportation list “Detect and Respond to Access Restrictions (One-Way, No Turn, Ramps, etc.)” among the “normal driving” expectations of a driverless car. Chelsea Kohler, an Uber spokeswoman, told Quartz late last month that the company considers its self-driving cars to be at “level four with the driver” on the DOT’s six-level scale for automation. The DOT defines level four as when “an automated system can conduct the driving task and monitor the driving environment, and the human need not take back control … in certain environments and under certain conditions.”
Uber is taking advantage of a regulatory void in Pennsylvania, which has yet to enact autonomous vehicle legislation. Its self-driving cars are insured for up to $5 million per incident, in line with pending legislation in the state. Uber has repeatedly declined to specify to Quartz who would be held liable were one of its self-driving cars involved in an accident, saying it doesn’t deal in hypotheticals. The company also doesn’t have an ethics board and is reluctant to discuss “trolley problem” scenarios, in which a car might have to protect one group of people (say, its passengers) at the cost of another (say, pedestrians). The DOT cautions in its guidelines that self-driving cars will inevitably have to be programmed to make “ethical judgements.”
Sonya Toler, a spokeswoman for the Pittsburgh Bureau of Police, said in an email that no traffic incidents involving Uber’s self-driving cars have been reported to the bureau. She added that there are “no agreements between the Police Bureau and Uber regarding information sharing, mishaps or protocols.” Katie O’Malley, a spokeswoman for Pittsburgh mayor Bill Peduto, said in an email that she is “not aware of incidents involving self-driving vehicles but it’s certainly possible,” and that the city has no formal agreements with Uber on what information about its autonomous vehicles needs to be shared. Google releases monthly reports on the activities of its self-driving cars, including collisions in either manual or autonomous mode.
Uber is in the process of hiring safety drivers to man its self-driving cars (technically “development vehicle operators”) and last month emailed Pittsburgh-area drivers suggesting they apply. Stachelek was among those who submitted an application. He hasn’t heard back yet and worries that self-driving technology like Uber’s will “cost more jobs than it’s going to create”—but not anytime soon. “We are a long way away from the driverless cars taking over, or at least I hope we are,” he says. “I don’t think the technology is ready yet.”