Tesla trial to size up cause of fatal autopilot crash: driver or tech

Autopilot is available on all new Teslas and is designed to help with driving tasks such as steering.

In a wrongful-death suit set for trial this week, a jury will determine who is at fault in a 2018 fatal crash.

Tesla is preparing for one of the biggest tests of its driver-assistance system, Autopilot: defending the company’s position that drivers, not the automaker, are ultimately responsible for crashes that involve the technology.

In a wrongful-death suit set for trial this week, a jury will determine who is at fault in the 2018 fatal crash involving 38-year-old Apple engineer Walter Huang. The driver died on Highway 101 in California after his Model X sport-utility vehicle crashed into a highway barrier while he was using Tesla’s driver-assistance technology.

Huang’s family is taking a distinctive approach, arguing that the automaker oversold Autopilot’s capabilities and failed to take sufficient steps to prevent customers from misusing the technology.

Tesla says Huang is responsible because he was playing a videogame while Autopilot was engaged. Government investigators and the Huangs agree that he was distracted in the moments leading up to the crash, but the family says Tesla is at fault because of its marketing of Autopilot.

If the Huangs prevail, the suit could represent a major financial liability for Tesla, potentially spurring additional cases that seek notable awards. The automaker is facing other disputes involving Autopilot.

“Every plaintiff’s lawyer that has one of these cases will be watching this," said Matthew Wansley, associate professor at Yeshiva University’s Cardozo School of Law, who has researched automated-driving systems and criticized Tesla’s marketing of the technology.

“The damage award could be significant here," he said, adding that juries in these cases typically award damages based on the victim’s lost income.

Tesla didn’t respond to requests for comment from The Wall Street Journal. Jury selection for the case starts Monday in San Jose, Calif., with opening arguments to follow as early as Thursday. The trial is expected to last several weeks.

A history of scrutiny

Several agencies have been investigating Autopilot, including the Justice Department and Securities and Exchange Commission, which have launched separate probes examining whether Tesla misled customers and investors about how Autopilot performs.

The National Highway Traffic Safety Administration has also been examining Autopilot and the automaker’s more expansive technology, called “Full Self-Driving Capability," for years, raising concerns that not enough guardrails are built in to ensure drivers use the systems appropriately. The regulator has launched more than 40 investigations into crashes suspected of being tied to Tesla’s Autopilot that resulted in 23 deaths.

Autopilot is available on all new Teslas and is designed to help with driving tasks such as steering and lane changes typically on highways. The Full Self-Driving upgrade features navigation on city streets.

Tesla sells subscriptions to enhanced versions of Autopilot as well as to Full Self-Driving. Tesla Chief Executive Elon Musk has said such sales could be significant profit drivers for the company.

The automaker says the Autopilot software isn’t designed for fully autonomous driving and allows drivers to take control when the technology is engaged. Tesla says its website and user manuals make clear that the software requires active driver supervision.

The system deploys a series of warnings to alert drivers if they aren’t paying attention to the road. In December, Tesla issued a safety recall that updated the software underpinning Autopilot, adding more warnings for drivers to ensure they “adhere to their continuous driving responsibility," the company wrote in a regulatory filing.

Tesla said it made the changes to resolve an investigation by regulators.

The automaker has prevailed in the last two trials, with jurors in the most recent one finding the company wasn’t responsible for the crash because they found no manufacturing defect with Autopilot.

Huang family suit

In the Huang case, the family alleges Tesla drivers were sold on the idea that Autopilot was safer than a human-operated car, but the automaker knew the technology had serious flaws that customers wouldn’t expect to encounter based on how Autopilot was marketed.

On the morning of the March 2018 crash, Huang was making his commute to work after dropping his son off at preschool. With Autopilot engaged while on the highway, Huang’s Model X approached a dividing area that sits between travel lanes of the highway and an exit ramp.

The Autopilot system moved the vehicle off the highway and into the dividing area, and then struck a barrier at about 70 miles an hour. Huang died as a result of blunt force trauma injuries sustained in the crash.

His family’s attorneys say Tesla is to blame for the incident because reasonable drivers believe Autopilot is safe and can navigate highways, according to court filings, citing statements and advertising by Musk and the carmaker.

Among the family’s evidence is an email from former Tesla President Jon McNeill. Two years before Huang’s crash, McNeill emailed the company’s head of Autopilot and Musk, saying he had driven several hundred miles in a Model X with the technology activated.

“I got so comfortable under Autopilot, that I ended up blowing by exits because I was immersed in emails or calls (I know, I know, not a recommended use)," McNeill wrote in a March 25 email that year.

One of the Huang family’s attorneys read the email during a deposition, according to a transcript reviewed by the Journal. The Journal couldn’t obtain the full text of his message.

Prove your case

The Huangs will have to demonstrate that Tesla could have improved its warning system for drivers who used Autopilot, which by then had been in use for a little more than two years, said Richard Cupp, law professor at Pepperdine’s Caruso School of Law.

Tesla has said Huang’s hands weren’t detected on the wheel for six seconds before the crash, and that he knew from prior trips that Autopilot had trouble at this particular spot on the highway, citing testimony from Huang’s wife and text messages. The company also says Huang was playing a videogame on his phone at the moment of impact.

“The sole cause of this crash was his highly extraordinary misuse of his vehicle and its Autopilot features so that he could play a videogame," Tesla said in a recent court brief.

Tesla has strong support for its argument that Huang was misusing Autopilot in the seconds before the crash, Wansley said, a point his family’s attorneys concede in court filings.

But the case highlights the issue of drivers becoming too complacent with partially automated technology, Wansley said.

“These crashes are happening because misuse happens all the time," said Wansley, adding that a significant verdict could force the company to do more to ensure drivers are responsible.
