Tesla's Autopilot mode is on trial in California

A plaintiff is seeking more than $3 million in damages for a 2019 accident she says was caused by Tesla's software
Plaintiff says Tesla's semi-autonomous mode fractured her jaw.
Photo: Florence Lo (Reuters)

The first trial over whether robotic driving technology poses a threat to human life is underway in a California court. The case centers on Tesla’s Autopilot software, which allegedly caused an accident on a city road in 2019.

According to Reuters, the plaintiff in the case is Los Angeles resident Justine Hsu, who first sued Tesla in 2020 after her Model S swerved into a barrier while the semi-autonomous mode was engaged. She says in court filings that her airbag deployed with so much force that it “knocked out teeth, and caused nerve damage to her face” and broke her jaw. Hsu claims that both the airbag system and the overall design of Autopilot, which Tesla launched in 2015, were flawed. She’s seeking more than $3 million in damages.

The electric car maker has denied any wrongdoing. Its defense rests in part on the fact that Hsu activated Autopilot on a city street, even though the car’s user manual warns against doing so. Tesla maintains that its cars are not fully autonomous, and that drivers should be ready “to take over at any moment.”

Tesla attorney Michael Carey claims that Hsu had time to brake the vehicle, yet still drove straight into the barrier. “The evidence proving distraction is pretty straightforward,” he said.

Autopilot’s safety record

Tesla vehicles running the Autopilot software were involved in 273 reported crashes between July 2021 and May 2022, according to data from the National Highway Traffic Safety Administration. That means Tesla vehicles made up nearly 70 percent of the 392 crashes involving advanced driver-assistance systems reported over that period.

Tesla CEO Elon Musk has long promoted Tesla’s “Full Self-Driving” (FSD) software, which the company sells as a $15,000 add-on to its vehicles. Automation is a major part of the company’s plans for future revenue growth, so investors and shareholders are likely to monitor the outcome of the trial closely. The company’s shares dropped by 8% when the incident was reported in 2019.

While flaws in Tesla’s Autopilot have previously been linked to deaths around the world, none of those incidents has gone to trial, making the outcome of this California case a critical signal for how robotic car software will be designed in the future, and a potential precedent for similar lawsuits.