The last major obstacle to cracking the code on self-driving cars

Mobileye’s CEO Amnon Shashua poses with a Mobileye driverless vehicle at the Nasdaq Market site in New York, U.S., July 20, 2021.
Image: REUTERS/Jeenah Moon

As companies race to develop self-driving cars, they’re making very different bets about the technology they’ll need to release the first safe, reliable, fully autonomous vehicle.

In the minimalist camp, automakers like Tesla are relying primarily on cameras. They’re betting computer vision will improve enough that, within a few years, artificial intelligence can navigate roads just using visual data from onboard cameras—a very bold bet given the underdeveloped state of today’s technology.

Then there’s the maximalist camp, which believes the best way to make self-driving cars reliable enough to sell is to feed them as much data as possible. Intel-owned Mobileye is a prime example: It plans to develop its own full self-driving software using every sensor available: cameras, radar, and lidar.

The final missing ingredient? A detailed map of every road a car might ever encounter—that is, every single road on Earth.

The high stakes of picking the right sensor suite

Goldilocks would have a hard time deciding how many sensors to put on a self-driving vehicle. Too few sensors, and a driving AI might start to miss things; the less a car can see, the more likely it is to crash and the harder it will be to meet regulators’ safety standards. Too many sensors, and you risk driving up the cost so high that few can afford your self-driving car. Even relatively cheap lidar sensors, for example, can cost $1,000 apiece (one 2020 rendering of Waymo’s sensor layout shows a car with four lidar sensors).

The company that strikes the right balance will be able to enter the market before its rivals and out-compete them on price.

Mobileye, based in Israel, says it’s leading this race. It’s the world’s largest supplier of advanced driver assistance systems (ADAS). These systems use an array of sensors (which might include cameras, radar, or lidar) and an advanced computer chip to power the safety features that, for example, beep when a driver is about to back into a wall or automatically brake to avoid a crash. Of the roughly 100 million driver assistance systems on the road today, about 80% were built by Mobileye, which sells ADAS chips to more than 30 carmakers.

Mobileye doesn’t manufacture cars itself, but it believes it can develop self-driving software safe enough to pass muster with regulators and sell to auto manufacturers by 2025. It’s already testing an early version of that software, which operates a fleet of robotaxis in Tokyo, Paris, Shanghai, and Detroit. Key to this effort is Mobileye’s plan to map the traffic lights, crosswalks, and other key features of streets around the world, which the company believes will help it meet safe-driving standards before other self-driving car developers do.

If Mobileye’s mapping approach works, it could help the company hit the market before rivals like Waymo (which is already operating a driverless cab service in Phoenix, Arizona) and Tesla (which saw a rocky debut of its “Full Self-Driving” feature in September and has rejected precision mapping as part of its strategy).

Self-driving cars are (not quite) safe enough

Mobileye CEO Amnon Shashua argues that a driver assistance system and a self-driving car aren’t so different. Each needs to use sensors to see the world around it, and then rely on a computer chip to parse that information and make a decision—for instance, to slam on the brakes if it sees the car ahead of it suddenly stop. The only difference between the two, Shashua says, is how often they make mistakes: A driver assistance system can afford to miss something every few minutes, because ultimately a human driver is responsible for driving the car safely. A self-driving car doesn’t have that luxury.

Shashua says engineers have already figured out most of the basic computer vision and AI challenges involved in making a self-driving car: Driverless cars do a decent job of sensing people, vehicles, and hazards on the road around them, and in simple road conditions, they can accelerate, brake, and turn pretty well by themselves.

But they still don’t perform well enough to be trusted on their own: Fully self-driving cars have already been involved in a handful of fatal crashes, and Tesla was forced to recall a beta version of its self-driving software because of its erratic and dangerous performance.

“The closer you get to the finish line, the harder it is to make incremental progress,” said Sam Abuelsamid, a transportation analyst at Guidehouse Insights. “The edge cases just become harder and harder to deal with.”

Why maps may be key to autonomy

Self-driving cars struggle to understand the nuances of driving on particular roads, says Shashua. Local laws, regional driving norms, and signage at a tricky intersection all vary from place to place, and current AI systems struggle to keep up.

A human driver might have a working understanding of local traffic laws, or at least they’d be able to parse a sign at an intersection that says “No right on red, Monday-Friday, 4-6 pm.” But “that’s not realistic for the current state of AI,” Shashua said in a January talk during the CES auto show. The cars lack the onboard computing power to analyze these situations in real time.

Instead, Mobileye wants to map the world’s roads as crib sheets for self-driving cars. “The idea behind the high-resolution maps is that you prepare all this information in advance,” Shashua explained. That way an autonomous vehicle can already know there’s a “no right on red” sign at the next intersection, rather than relying on a computer vision system to spot the sign, read it, and apply its meaning to the road on the fly. “Once you connect to an actionable high-definition map, the performance accuracy is an order of magnitude better than if you don’t have a map,” Shashua argues. That difference matters for building regulators’ and the public’s trust in self-driving cars, and eventually for securing regulatory approval and winning customers.
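The crib-sheet idea boils down to a lookup: rules that a vision system would otherwise have to read off signage in real time are stored per intersection ahead of time. The sketch below illustrates the concept using the article’s own “No right on red, Monday-Friday, 4-6 pm” example; the data layout and all names are hypothetical, not Mobileye’s actual map format.

```python
from datetime import datetime

# Hypothetical high-definition map fragment: signage rules precomputed per
# intersection, so the car queries a lookup instead of reading signs live.
HD_MAP = {
    "intersection_42": {
        # "No right on red, Monday-Friday, 4-6 pm" encoded in advance.
        # weekday(): Monday=0 ... Friday=4; hours are in 24-hour time.
        "no_right_on_red": {"days": range(0, 5), "start_hour": 16, "end_hour": 18},
    },
}

def right_on_red_allowed(intersection_id: str, when: datetime) -> bool:
    """Answer a rule query from the precomputed map, not live sign reading."""
    rule = HD_MAP.get(intersection_id, {}).get("no_right_on_red")
    if rule is None:
        return True  # no restriction recorded for this intersection
    restricted = (when.weekday() in rule["days"]
                  and rule["start_hour"] <= when.hour < rule["end_hour"])
    return not restricted

# A Wednesday at 5 pm falls inside the restriction window:
print(right_on_red_allowed("intersection_42", datetime(2021, 10, 6, 17, 0)))  # False
# The same turn on a Saturday is fine:
print(right_on_red_allowed("intersection_42", datetime(2021, 10, 9, 17, 0)))  # True
```

The point of the design is that the expensive work (reading and interpreting the sign) happens once, offline, when the map is built; at drive time the car does only a cheap lookup.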

Crowdsourcing a map of the world’s roads

The company plans to crowdsource the data it needs to build and maintain its map from cars that have Mobileye driver assistance systems installed. Those cars use onboard sensors to take in information about the road around them, and then upload that data to a cloud computer network owned by Mobileye. There are already 1 million cars sending Mobileye data, Shashua says, and by 2022 that number will grow to 5 million.

Mobileye struck its first data-sharing deals with Volkswagen, BMW, and Nissan in 2018; any car they sold from then on that came equipped with a driver assist system would send driving data to Mobileye to help the company develop its maps. Since then, Mobileye has struck similar deals with three more carmakers, which it hasn’t named.

Since data collection began in 2018, Shashua says, Mobileye’s maps have come to cover all of Europe and Japan and much of the US. The company is working with partners in China to navigate regulatory barriers there. The rest of the world, as yet, has limited coverage.

Mobileye plans to share an update on its self-driving car research at the next CES auto show in January. In the meantime, the company has doubled its ADAS chip production over the last four years to just under 20 million chips per year; it’s betting that every driver assistance chip it delivers brings it another step closer to rolling out a fully autonomous vehicle.