Tesla Pays $5 Million Settlement in Autopilot Case

Without admitting wrongdoing, Tesla recently paid $5 million to settle a class action lawsuit after its Autopilot “self-driving” system was implicated in two fatal crashes in which drivers had the driver-assist system enabled. As part of the settlement, Tesla said it had implemented improved Autopilot hardware and software and agreed to compensate consumers who had to wait longer than anticipated for their self-driving vehicles.

First Tesla-related Fatality

Autopilot was first enabled on Tesla’s cars in October 2015; it combines radar-based adaptive cruise control with automatic steering that keeps the car within its lane markings. Joshua Brown was the first person killed in a crash while driving a Tesla with Autopilot enabled. His Model S crashed into a truck that turned across his path in Florida in May 2016. Neither Brown nor the car’s computers detected the white side of the truck against the bright sky.

The National Highway Traffic Safety Administration (NHTSA), after a lengthy investigation into Brown’s crash, found that the Autopilot system was operating as intended at the time of the crash, that it was not defective, and that Tesla did not need to recall any of its vehicles, essentially concluding that the crash was Brown’s fault. The National Transportation Safety Board (NTSB), however, said Tesla should share some of the blame for creating and selling a system that is too easy for drivers to misuse.

After Brown’s death, Tesla made changes so that Autopilot relies more on data from its radar and less on its camera to spot obstacles in the vehicle’s path. A software update also sharply reduced how long a driver can keep his hands off the steering wheel and introduced brighter, flashing warnings to prompt drivers to act.
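To make that shift in sensor priority concrete, here is a minimal sketch of weighted sensor fusion in Python. The weights, threshold, and function name are invented purely for illustration and are not Tesla’s actual values or code.

```python
# Rough illustration of the kind of change described above: weighting radar
# returns more heavily than camera detections when deciding whether something
# ahead is a real obstacle. Weights, threshold, and names are invented for
# this example; they are not Tesla's actual values or code.

def obstacle_confirmed(radar_confidence, camera_confidence,
                       radar_weight=0.7, camera_weight=0.3, threshold=0.5):
    """Fuse two sensor confidences (each 0.0-1.0) into a single yes/no decision."""
    fused = radar_weight * radar_confidence + camera_weight * camera_confidence
    return fused >= threshold


if __name__ == "__main__":
    # A bright sky that washes out the camera no longer vetoes a strong radar return.
    print(obstacle_confirmed(radar_confidence=0.9, camera_confidence=0.1))  # True
```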

Tesla Model X Autopilot Fatality

The latest fatality occurred on March 23, 2018, when the Autopilot feature was enabled on Wei Huang’s Tesla Model X sport utility vehicle as it slammed into a highway lane divider and burst into flames. Huang died in the hospital shortly after the accident. According to wired.com, citing a blog post on Tesla’s website, a Tesla spokesperson said the car’s computer logs from around the time of the crash indicate that Autopilot was on.

When Autopilot is on with the cruise control set, the car stays in its lane and keeps a set distance from the car ahead. The driver is supposed to keep his hands on the wheel and monitor the car and its surroundings. If the driver’s hands are off the wheel for longer than the allowed time, a visual alert on the dashboard tells him to put his hands back on the wheel. If his hands remain off the wheel, the system emits a loud beep to get his attention. If all warnings are ignored, the car’s hazard lights come on, and it eventually slows to a full stop.
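As a rough illustration of that escalation sequence, here is a minimal sketch in Python. The time thresholds, alert descriptions, and function name are assumptions made for the example and do not reflect Tesla’s actual timings or software.

```python
# Illustrative sketch of the escalating hands-off warning sequence described
# above. The time thresholds, alert descriptions, and function name are
# invented for clarity; this is not Tesla's actual Autopilot logic.

def autopilot_warning_state(seconds_hands_off):
    """Map time with hands off the wheel to an escalating alert level."""
    if seconds_hands_off < 15:
        return "no alert"                                  # still within the allowed hands-off window
    elif seconds_hands_off < 30:
        return "visual alert on the dashboard"             # first warning: put hands back on the wheel
    elif seconds_hands_off < 45:
        return "loud audible beep"                         # next warning if the visual alert is ignored
    else:
        return "hazard lights on, slowing to a full stop"  # final stage: the car brings itself to a halt


if __name__ == "__main__":
    for t in (5, 20, 35, 60):
        print(f"{t:>3} s hands off -> {autopilot_warning_state(t)}")
```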

Tesla claims that in the Huang crash, the driver’s hands were not detected on the wheel for the six seconds before the car struck the concrete lane divider, and that he had been given multiple visual warnings and one audible warning to put his hands back on the wheel. According to Tesla’s blog, the car’s computer logs indicate Huang failed to heed the visual and audible warnings given by the Autopilot system. Tesla emphasized that “Autopilot is a driver assistance tool, not a replacement, and that they [the drivers] retain responsibility for driving safely.”

Fraud by Concealment

The class action lawsuit, filed in 2017, named six Tesla Model S and Model X owners from Colorado, California, Florida, and New Jersey to represent a nationwide class of consumers affected by the underperforming Autopilot system. The plaintiffs alleged that the company committed “fraud by concealment”: they had paid an extra five thousand dollars for the Autopilot software, which was supposed to provide added safety features, yet those features simply did not work as the company claimed they would.

The self-driving vehicle craze is what has brought the company and its co-founder and CEO, Elon Musk, into the headlines so often of late, but many people are leery of the reliability of autonomous vehicles, and the technology’s very short track record may give still others pause.

About Lynn Fugaro

Lynn has been writing web content since 2007 after a lengthy career as a middle school English teacher and administrator. Writing web content seemed a natural progression following a career teaching adolescents about the beauty and the power of the written word, and she quickly got hooked on the challenge of writing SEO- and reader-friendly content that could be found on Page 1 of Google and other search engines.

Having written content for physicians and attorneys during the first few years of her writing career, Lynn has most recently produced original, informative, entertaining, and relevant content for the entertainment industry, the automotive industry, senior communities, pet rescues, and numerous other businesses hoping to increase website traffic and page views with vibrant content.