EYES UP FRONT

An AI-powered design trick could help prevent accidents like Uber’s self-driving car crash

While self-driving technology is still in its infancy, passengers are quickly coming to trust their cars to operate without them. Tesla owners play games and even sleep as their cars drive themselves, and Alphabet-owned Waymo has made an ad showing passengers in the backseat of its autonomous minivan yawning and taking selfies.

But taking your eyes off the road, even in an autonomous vehicle, can be dangerous, as the world realized after the video of Uber’s self-driving car killing a pedestrian emerged. The video showed that the car’s safety driver had been distracted in the moments leading up to the accident. These self-driving tests, like the cars with autonomous features on the road today such as Teslas and Cadillacs, require a human to pay attention at all times—though that is seldom the case.

Startups and automotive companies say there’s a technological solution for distracted driving in both autonomous and regular vehicles: artificial intelligence algorithms that analyze video in real time to detect whether a person is watching the road.

“Imagine if this camera [in Uber’s car] was actually analyzing the driver’s head pose, eye closure rate, eyes on the road or not, various emotional and cognitive states, and in real time was able to alert if the safety driver was not paying attention,” says Rana el Kaliouby, CEO of Affectiva, an AI startup working with auto manufacturers like BMW and Daimler on this problem.

Since the Uber car already had a driver-facing camera, el Kaliouby notes, the technology and infrastructure for this solution already exists. All that’s needed is to implement software that actually monitors the driver.
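At its core, such monitoring software reduces each camera frame to a few signals—head pose, eye closure, gaze direction—and flags the driver as inattentive when any of them crosses a threshold. A minimal sketch of that per-frame check is below; the field names and threshold values are illustrative assumptions, not taken from Affectiva’s or any production system.

```python
from dataclasses import dataclass

@dataclass
class FrameSignals:
    """Per-frame outputs a driver-facing vision model might produce.
    These fields and their ranges are hypothetical."""
    eye_closure: float    # 0.0 = fully open, 1.0 = fully closed
    head_yaw_deg: float   # 0 = facing straight ahead
    gaze_on_road: bool    # whether the estimated gaze vector hits the roadway

def driver_attentive(sig: FrameSignals,
                     max_eye_closure: float = 0.6,
                     max_head_yaw_deg: float = 30.0) -> bool:
    """Flag inattention if the eyes are mostly closed, the head is turned
    too far away, or the estimated gaze leaves the road.
    Threshold values are illustrative only."""
    if sig.eye_closure > max_eye_closure:
        return False
    if abs(sig.head_yaw_deg) > max_head_yaw_deg:
        return False
    return sig.gaze_on_road
```

In practice a system would smooth these signals over many frames before alerting, so a single blink or glance at a mirror doesn’t trigger a warning.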

Cadillacs with the “Super Cruise” semi-autonomous mode already have a form of this technology. These cars have a small camera located in the steering wheel that tracks a person’s head and eyes. If the camera detects that a person is not looking at the road, the steering wheel begins to flash and the car makes a warning noise, prompting them to pay attention. If the driver still isn’t watching the road, the car automatically slows down to a stop, puts on the hazard lights, and calls the emergency vehicle-assistance service OnStar.

“How long it takes before the system notices a driver is not paying attention depends on your speed,” Robb Bolio, a lead engineer for GM’s autonomous vehicles unit, told CNBC. “If you are going 75 miles per hour, it’s three or four seconds, depending on the traffic around you. If you are in bumper-to-bumper traffic going 10 miles per hour, it’s a little longer.”
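The behavior Bolio describes—a speed-dependent grace period followed by escalating interventions—can be sketched as a simple timer-driven state machine. The thresholds and staging below are assumptions for illustration; GM has not published Super Cruise’s actual logic.

```python
def grace_period(speed_mph: float) -> float:
    """Hypothetical: allow less inattention time at higher speeds,
    loosely matching the 'three or four seconds at 75 mph' figure."""
    return 3.0 if speed_mph >= 40 else 6.0

class AttentionMonitor:
    """Escalates from a visual warning to an audible one to slowing
    the car, based on how long the driver has looked away."""

    def __init__(self):
        self.inattentive_since = None  # timestamp when driver last looked away

    def update(self, eyes_on_road: bool, speed_mph: float, now: float) -> str:
        if eyes_on_road:
            self.inattentive_since = None
            return "ok"
        if self.inattentive_since is None:
            self.inattentive_since = now
        elapsed = now - self.inattentive_since
        t = grace_period(speed_mph)
        if elapsed < t:
            return "flash_steering_wheel"
        if elapsed < 2 * t:
            return "audible_warning"
        return "slow_to_stop"  # in Super Cruise, also hazards + OnStar call
```

A glance back at the road resets the timer, so the escalation only continues through sustained inattention.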

Euro NCAP, an automotive safety organization backed by the European Commission and five European governments, has already said (pdf) that driver monitoring systems like this will be a key factor in how it rates cars for safety.

“The idea is that this is a symbiotic relationship between human and machine,” el Kaliouby says. “We need to leverage that relationship until we’re comfortable that these vehicles can drive in an autonomous mode. I don’t think anyone would say we’re at a place where we’re comfortable doing that.”
