Evidence suggests that Tesla’s advanced driver-assistance system, Full Self-Driving (FSD), was engaged during a fatal crash that killed Tesla employee Hans von Ohain in Colorado in 2022. If that proves accurate, a Tesla employee would be the first person killed by Elon Musk’s “self-driving” software.
The Washington Post reports that on May 16, 2022, Hans von Ohain was killed when his Tesla Model 3 crashed into a tree and caught fire in Evergreen, Colorado. Von Ohain worked as a recruiter at Tesla and was an avid fan of CEO Elon Musk. His passenger, Erik Rossiter, survived the crash.
Rossiter told 911 dispatchers that von Ohain had activated an “auto-drive feature” on the Tesla, which led the car to veer off the road on its own. In a later interview, Rossiter clarified that he believes von Ohain was using Tesla’s Full Self-Driving feature at the time of the crash.
Full Self-Driving is Tesla’s most advanced driver-assistance technology, designed to guide the vehicle on roads from quiet suburbs to busy cities with little input from the driver. More than 400,000 Tesla owners have access to the FSD software, which remains in beta testing.
If Rossiter’s account proves true, this would likely be the first known fatality involving Full Self-Driving. In late 2021, federal regulators began requiring automakers to report crashes involving driver-assistance systems. Since then, they have logged more than 900 crashes involving Tesla EVs, including at least 40 that caused serious or fatal injuries. Most of those crashes involved Tesla’s less advanced Autopilot system.
According to the police report, there were no skid marks at the Colorado crash site, suggesting von Ohain did not brake before impact. The car also continued powering its wheels after hitting the tree, a sign that an advanced driver-assistance system was still active at the time.
An autopsy showed von Ohain had a blood alcohol level over three times the legal limit. Experts say this level of intoxication would have seriously hampered his ability to maintain control. However, the sophisticated self-driving capabilities von Ohain believed were engaged may have given him undue confidence in the car’s ability to correct itself.
Tesla has faced growing complaints over unreliable behavior by its driver-assistance software, including sudden swerving or braking. Lawsuits claim Tesla should share responsibility when its technology causes crashes or fails to prevent them. So far, Tesla has avoided liability by arguing that drivers must stay alert and in control.
Breitbart News has previously reported on dangerous incidents involving Musk’s “full self-driving” software, including video of a Tesla running a red light while driving itself. In another case, a Tesla caused an eight-car pileup on the San Francisco Bay Bridge.
Read more at the Washington Post here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.