Tesla draws rebuke for blaming autopilot death on Model X driver
Autonomous-vehicle experts criticized Tesla for issuing a statement about the death of a customer that pinned the blame on driver inattentiveness
San Francisco/Washington/Michigan: Consumer-safety advocates and autonomous-vehicle experts criticized Tesla Inc. for issuing another statement about the death of a customer that pinned the blame on driver inattentiveness.
Days after publishing a second blog post about the crash involving Walter Huang, a 38-year-old who died last month in his Model X, Tesla issued a statement in response to his family speaking with San Francisco television station ABC7. The company said the “only” explanation for the crash was “if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”
“I find it shocking,” Cathy Chase, president of the group Advocates for Highway and Auto Safety, said by phone Wednesday. “They’re claiming that the only way for this accident to have occurred is for Mr. Huang to be not paying attention. Where do I start? That’s not the only way.”
Groups including Advocates for Highway and Auto Safety and Consumer Reports have criticized Tesla for years for naming its driver-assistance system Autopilot, with the latter calling on the company to choose a different moniker back in July 2016. The two organizations share the view of the National Transportation Safety Board, which has urged carmakers to do more to ensure drivers using partial-autonomy systems like Autopilot remain engaged with the task of driving. The US agency is in the midst of two active investigations into Autopilot-related crashes.
It’s Tesla’s responsibility to provide adequate safeguards against driver misuse of Autopilot, including by sending visual and audible warnings when the system needs a human to take back over, Chase said. “If they’re not effective in getting someone to reengage—as they say that their drivers have to—then they’re not doing their job.”
The stakes for Tesla’s bid to defend Autopilot are significant. The NTSB’s investigation of the 23 March crash involving Huang contributed to a major selloff in the company’s shares late last month. Chief executive officer Elon Musk claimed almost 18 months ago that the system would eventually render Tesla vehicles capable of full self-driving, and much of the value of the $51 billion company is linked to views that it could be an autonomous-car pioneer.
Tesla has declined to say how long drivers can now use Autopilot between visual or audible warnings to have a hand on the wheel. It has also refused to comment on how many alerts can be ignored before the system disengages, what version of Autopilot software was in Huang’s Model X, or when the car was built.
“Just because a driver does something stupid doesn’t mean they, or others who are truly blameless, should be condemned to an otherwise preventable death,” said Bryant Walker Smith, a professor at the University of South Carolina’s School of Law, who studies driverless-car regulations. “One might consider whether there are better ways to prevent drivers from hurting themselves or, worse, others.”
The NTSB is looking into the crash that killed Huang, as well as a collision in January involving a Tesla Model S using Autopilot that rear-ended a fire truck parked on a freeway near Los Angeles. The agency said after Tesla’s second blog post about the Huang incident that it was unhappy with the company for disclosing details during its investigation.
In its latest statement, Tesla said it is “extremely clear” that Autopilot requires drivers to be alert and have hands on the steering wheel. The system reminds the driver of this every time it is engaged, according to the company.
“Tesla’s response is reflective of its ongoing strategy of doubling down on the explicit warnings it has given to drivers on how to use, and not use, the system,” said Mike Ramsey, an analyst at Gartner Inc. “It’s not the first time Tesla has taken this stance.”
Minami Tamaki LLP, the San Francisco-based law firm that Huang’s family has hired, said in a statement Wednesday that it believes Tesla’s Autopilot is defective and likely caused Huang’s death. The firm declined to comment on Tesla’s statement.
The National Highway Traffic Safety Administration, which has the power to order recalls and fine auto manufacturers, found no defect after investigating the May 2016 crash involving a Tesla Model S driven on Autopilot by Josh Brown, a former Navy SEAL. The agency closed its probe in January 2017.
According to data Tesla gave NHTSA investigators before the agency decided against a recall, Autopilot’s steering system may reduce the rate of crashes per million miles driven by about 40%, a figure the company cited in its latest statement.
“We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road,” Tesla said. “The reason that other families are not on TV is because their loved ones are still alive.”
Neither Tesla nor NHTSA has released the underlying data to support the crash-rate reduction claim.
“Tesla explicitly uses data gathered from its vehicles to protect itself, even if it means going after its own customers,” said Ramsey, the Gartner analyst. Bloomberg