The autonomous truck startup TuSimple is reportedly under investigation after attempting to blame a self-driving truck crash on “human error.” Autonomous vehicle researchers at Carnegie Mellon University say that assigning the blame to a human is misleading, and that common safeguards would have prevented the incident from taking place.

In April, an autonomously driven semi-trailer truck equipped with TuSimple technology was traveling down a highway in Tucson, Arizona, when it suddenly veered left and slammed into a concrete barricade, according to dashcam footage that was leaked to YouTube.

TuSimple blamed the accident on “human error,” but an internal report reviewed by the Wall Street Journal suggests that pinning the crash on a human is an oversimplification.

According to the internal report, the crash occurred because “a person in the cab hadn’t properly rebooted the autonomous driving system before engaging it, causing it to execute an outdated command,” the Wall Street Journal reports.

Essentially, the left-turn command was 2.5 minutes old; it should have been erased from the system but was not.

But the Carnegie Mellon researchers contend that blaming a human alone is misleading, and that standard safeguards would have prevented the incident.

Researchers told the Journal that the truck should not respond to commands that are even a few hundredths of a second old, and that the system should never allow a self-driving truck to turn so sharply while traveling at 65 miles per hour.
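To make those two safeguards concrete, here is a minimal, purely illustrative sketch in Python. Every name and threshold in it is an assumption made for the sake of the example; it does not reflect TuSimple’s actual software, only the kind of command-freshness and steering-envelope checks the researchers describe.

```python
import time
from dataclasses import dataclass
from typing import Optional

# Hypothetical thresholds, chosen only to illustrate the researchers' point.
MAX_COMMAND_AGE_S = 0.05      # reject commands more than a few hundredths of a second old
MAX_STEER_DEG_AT_SPEED = 2.0  # assumed cap on steering angle at highway speed
HIGHWAY_SPEED_MPH = 65.0

@dataclass
class SteeringCommand:
    angle_deg: float   # requested steering angle
    timestamp: float   # time.monotonic() reading when the command was issued

def is_safe_to_execute(cmd: SteeringCommand, speed_mph: float,
                       now: Optional[float] = None) -> bool:
    """Return True only if the command is fresh and within the
    speed-dependent steering envelope; otherwise it must be discarded."""
    now = time.monotonic() if now is None else now

    # Safeguard 1: never execute a stale command, e.g. one left in a
    # queue from before the system was last engaged.
    if now - cmd.timestamp > MAX_COMMAND_AGE_S:
        return False

    # Safeguard 2: at highway speed, refuse steering inputs sharp
    # enough to throw the vehicle into a barrier.
    if speed_mph >= HIGHWAY_SPEED_MPH and abs(cmd.angle_deg) > MAX_STEER_DEG_AT_SPEED:
        return False

    return True

# A 2.5-minute-old command, like the one behind the April crash, is rejected:
stale = SteeringCommand(angle_deg=-15.0, timestamp=time.monotonic() - 150.0)
assert not is_safe_to_execute(stale, speed_mph=65.0)

# A fresh, gentle command at the same speed passes:
fresh = SteeringCommand(angle_deg=0.5, timestamp=time.monotonic())
assert is_safe_to_execute(fresh, speed_mph=65.0)
```

Under those assumptions, the 2.5-minute-old left-turn command would have been dropped twice over: once for being stale, and again for demanding a sharp turn at 65 miles per hour.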

“This information shows that the testing they are doing on public roads is highly unsafe,” Phil Koopman, an associate professor at Carnegie Mellon, told the Journal.

On Tuesday, TuSimple said in a blog post, “We take our responsibility to find and resolve all safety issues very seriously,” adding that it responded to the April accident by “immediately ground[ing] our entire autonomous fleet and launch[ing] an independent review to determine the cause of the incident.”

“With learnings from this review in hand, we upgraded all of our systems with new automated system checks to prevent this kind of human error from ever happening again and we reported the incident to NHTSA and the Arizona Department of Transportation,” the company added.

Nonetheless, the National Highway Traffic Safety Administration (NHTSA) is joining the Federal Motor Carrier Safety Administration (FMCSA) in investigating the San Diego-based company.

The FMCSA said in a letter that it has launched a “safety compliance investigation” into TuSimple, referring to the April accident.

TuSimple is not the only self-driving vehicle company under investigation by the NHTSA.

The federal agency has launched a probe into yet another fatal car crash involving Tesla’s Autopilot “full self-driving” system. The latest Tesla crash under federal investigation resulted in three fatalities.

In June, the federal probe of Tesla’s Autopilot function escalated, with the NHTSA now investigating whether the Autopilot feature is potentially defective. The agency is studying data on 200 Tesla crashes, stating that “On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact.”

You can follow Alana Mastrangelo on Facebook and Twitter at @ARmastrangelo, and on Instagram.