
Tesla Autopilot cars may be recalled, investigation upgraded

Tesla cars with Autopilot are on the verge of being recalled.

An advertisement promotes Tesla Autopilot cars at a showroom of U.S. car manufacturer Tesla in Zurich, Switzerland March 28, 2018. REUTERS/Arnd Wiegmann (REUTERS)

Tesla Autopilot cars are one step closer to being recalled after the United States stepped up its inquiry into a number of crashes with parked emergency vehicles or trucks with warning signals.

The National Highway Traffic Safety Administration (NHTSA) said on June 9 that the Tesla investigation had been upgraded to an engineering analysis, a step that indicates closer scrutiny of the electric vehicle maker and of automated systems that perform at least some driving tasks.

An engineering analysis is the final part of an investigation, and in most cases, the NHTSA decides within a year whether to issue a recall or terminate the inquiry.

The agency's documents raise serious concerns about Tesla's Autopilot system. NHTSA found that it is being used in areas where its capabilities are limited, and that many drivers fail to take action to avoid crashes despite warnings from the vehicle.

The investigation has now expanded to cover practically all of the Austin, Texas-based automaker's vehicles sold in the United States since the start of the 2014 model year. According to NHTSA, 16 crashes involving emergency vehicles and trucks with warning signs resulted in 15 injuries and one death.

Investigators will evaluate additional data and vehicle performance, and will examine the degree to which Autopilot may exacerbate human-factors or behavioural safety risks, undermining the effectiveness of the driver's supervision.

In a majority of the 16 crashes, the Teslas issued forward-collision alerts to the drivers just before impact. Automatic emergency braking intervened to at least slow the cars in about half of the cases. On average, Autopilot relinquished control of the cars less than a second before the crash, according to NHTSA records summarising the investigation.

NHTSA also said it's looking into crashes involving similar patterns that did not include emergency vehicles or trucks with warning signs.

The agency found that in many cases, drivers had their hands on the steering wheel as Tesla requires, yet failed to take action to avoid a crash. This suggests that drivers comply with Tesla's monitoring system, but that the system does not ensure they are paying attention.

In crashes where video is available, drivers should have seen first-responder vehicles an average of eight seconds before impact, the agency wrote. The agency will have to decide whether there is a safety defect in Autopilot before pursuing a recall.

Investigators also wrote that a driver's use or misuse of the driver monitoring system “or operation of a vehicle in an unintended manner does not necessarily preclude a system defect.”

The agency's documents all but say that Tesla's method of making sure drivers pay attention isn't good enough, that it's defective and should be recalled, said Bryant Walker Smith, a University of South Carolina law professor who studies automated vehicles.

“It is really easy to have a hand on the wheel and be completely disengaged from driving,” he said. Monitoring a driver's hand position is not effective because it only measures a physical position. “It is not concerned with their mental capacity, their engagement or their ability to respond.”

Similar systems from other companies such as General Motors' Super Cruise use infrared cameras to watch a driver's eyes or face to ensure they're looking forward. But even these systems may still allow a driver to zone out, Walker Smith said.

“This is confirmed in study after study,” he said. “This is an established fact that people can look engaged and not be engaged. You can have your hand on the wheel and you can be looking forward and not have the situational awareness that's required.”

In total, the agency looked at 191 crashes but removed 85 of them because other drivers were involved or there wasn't enough information to make a definitive assessment. Of the remaining 106, the main cause of about one-quarter of the crashes appeared to be running Autopilot in areas where it has limitations, or in conditions that can interfere with its operation.

“For example, operation on roadways other than limited access highways, or operation in low traction or visibility environments such as rain, snow or ice,” the agency wrote.

Other automakers limit use of their systems to limited-access divided highways.

The National Transportation Safety Board (NTSB), which has also investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot's use to areas where it can operate safely. The NTSB also recommended that NHTSA require Tesla to implement a better system for making sure drivers are paying attention. NHTSA has yet to act on the recommendations; the NTSB can only make recommendations to other federal agencies.

In a statement, NHTSA said there aren’t any vehicles available for purchase today that can drive themselves. “Every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicles,” the agency said.

Driver-assist systems can help avoid crashes but must be used correctly and responsibly, the agency said.

Tesla did an online update of its Autopilot software last fall to improve camera detection of emergency vehicle lights in low-light conditions. NHTSA has asked why the company didn't do a recall.

NHTSA began its inquiry in August of last year after a string of crashes since 2018 in which Teslas using the company's Autopilot or Traffic Aware Cruise Control systems hit vehicles at scenes where first responders used flashing lights, flares, an illuminated arrow board or cones to warn of hazards.

(With AP inputs)

Published: 10 Jun 2022, 10:59 AM IST