Lawsuit: Elon Musk’s Tesla Misrepresented ‘Autopilot’ Safety Leading to Deadly Accident

Tesla boss Elon Musk shrugs (ODD ANDERSEN/Getty)

The family of a Tesla driver who died when his car crashed into a parked fire truck while using Elon Musk’s Autopilot system has filed a lawsuit against the electric vehicle manufacturer, alleging “fraudulent misrepresentation” of the technology’s safety.

CNBC reports that the family of Genesis Giovanni Mendoza-Martinez, a driver who lost his life in a 2023 collision, has filed a lawsuit against the company. The suit, originally filed in Contra Costa County in October, was recently moved to federal court in California’s Northern District at Tesla’s request. The plaintiffs allege that Tesla’s “fraudulent misrepresentation” of its Autopilot technology was a contributing factor in the fatal crash.

The incident in question involved a 2021 Tesla Model S sedan, which collided with a parked fire truck in Walnut Creek, California, while the driver was utilizing the Autopilot system. Tragically, Mendoza-Martinez lost his life in the crash, while his brother Caleb, a passenger in the vehicle, sustained serious injuries.

According to the lawsuit, Tesla and CEO Elon Musk have consistently exaggerated or made false claims about the capabilities and safety of the Autopilot system over the years. The plaintiffs’ attorneys argue that these misrepresentations were made to generate excitement about the company’s vehicles and improve its financial standing. They cite various instances, including tweets, blog posts, earnings calls, and press interviews, where Tesla and Musk allegedly made misleading statements about the technology.

In response to the allegations, Tesla’s attorneys have asserted that the driver’s own negligence was the primary cause of the collision and that reliance on any representation made by the company was not a substantial factor in the harm suffered by the driver or passenger. They maintain that Tesla’s vehicles and systems have a “reasonably safe design” and comply with state and federal laws.

This lawsuit is one of at least 15 active cases centered on similar claims, involving Tesla incidents in which Autopilot or its premium version, Full Self-Driving (Supervised), was in use just before a fatal or injury-causing crash. Three of these cases have also been moved to federal courts.

The crash at the center of the Mendoza-Martinez lawsuit has been part of a broader investigation by the National Highway Traffic Safety Administration (NHTSA) into Tesla’s Autopilot system. The agency has also initiated a second probe to evaluate the effectiveness of Tesla’s “recall remedy” in addressing issues with Autopilot’s behavior around stationary first responder vehicles.

Both the NHTSA and the California Department of Motor Vehicles have raised concerns about Tesla’s marketing and advertising practices surrounding its Autopilot and FSD systems. The NHTSA has warned that Tesla’s social media posts may mislead drivers into believing that its cars are fully autonomous robotaxis, while the California DMV has sued the company, alleging that its claims amount to false advertising.

Read more at CNBC here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.
