A Tesla car crashed into a fire truck blocking traffic as a car was towed on Interstate 680 in Contra Costa County, California, on Saturday. The driver of the Tesla was killed in the crash and a passenger was taken to the hospital. Although it isn’t yet known if the Tesla had Elon Musk’s “Autopilot” engaged, there have been multiple accidents in which Teslas with Autopilot engaged have struck fire trucks and other safety vehicles stopped on the road.
The Wall Street Journal reports that early in the morning on February 18, a Contra Costa County Fire Protection District fire truck was parked across two lanes of Interstate 680 to block traffic while first responders assisted with the towing of a disabled vehicle when a Tesla vehicle smashed into it. A passenger in the Tesla was taken to the hospital in critical condition, while the driver of the electric car was pronounced dead at the scene. All four firefighters on the truck were wearing their seatbelts and, luckily, were taken to the hospital with only minor injuries.
It is not yet known if the Tesla driver had engaged Elon Musk’s “Autopilot” software, and the cause of the crash is still being investigated. The purpose of the Autopilot system is to help drivers with activities like steering and keeping a safe distance from other vehicles on the road. Regulators and safety advocates, however, have recently begun to question the system due to a series of accidents involving Tesla cars, particularly crashes with emergency vehicles. Despite Tesla’s insistence that drivers remain attentive with Autopilot engaged, one recent viral video showed a woman who appeared to be asleep behind the wheel while relying on Autopilot to keep her on the road.
After numerous crashes at accident scenes, the National Highway Traffic Safety Administration (NHTSA) has been investigating Tesla’s advanced driver-assistance system for more than a year. The agency has voiced concerns about the system’s ability to identify parked emergency vehicles and the particular difficulties these vehicles pose for driver-assistance systems. Late in January, the NHTSA informed Tesla of potential concerns regarding how the system behaves in particular roadway environments.
Last week, Tesla revealed that it would recall nearly 363,000 cars with the Full Self-Driving Beta feature to fix issues with how the technology handles specific driving maneuvers. According to the NHTSA, some Teslas may occasionally defy local traffic laws, potentially raising the risk of a collision if a driver doesn’t take appropriate action.
Full Self-Driving adds more features for driver assistance without actually making cars drive themselves. According to Tesla, drivers must always be alert and prepared to take control of the vehicle when using the system. Critics contend that the system is not foolproof and that drivers shouldn’t rely on it to drive safely, despite the company’s claim that using Autopilot makes driving safer than not using it.
The latest California collision is the most recent in a string of incidents involving Tesla cars that have prompted questions about the safety of the manufacturer’s driver-assistance technology. In May of 2022, the NHTSA launched a special crash investigation into a fatal wreck involving a Tesla vehicle that killed three people. The vehicle involved was identified as a 2022 Tesla Model S, and the incident was added to a list of auto crashes the NHTSA believes may be linked to semiautonomous driving features like Tesla’s “Full Self-Driving” Autopilot system.
There have been numerous other collisions between Tesla vehicles and emergency vehicles since then, prompting calls for stricter guidelines and safety requirements for driver-assistance technologies.
Read more at the Wall Street Journal here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan