When 89 fatalities and numerous serious injuries are caused by unintended acceleration, why isn’t there more media coverage? It might be because software/hardware failure is not easy to pinpoint, especially when source code is proprietary.
More lawsuits allege that Toyota failed to exercise reasonable care in developing and testing code that controls the throttle in its vehicles. Software engineering processes have become the issue in these trials. We’ll explain more in this blog and link to additional resources.
This current wave of litigation comes after the carmaker paid a $1.2 billion fine to settle a U.S. Justice Department criminal investigation in 2014 (at the time, the largest fine imposed on an auto company). The company had blamed unintended acceleration (UA) on driver error and admitted concealing problems from regulators, even after it had issued a floor mat recall related to UA. Later, the company admitted to flawed gas pedal designs as well.
Loss of control over the throttle
A bellwether case was brought to a jury in Oklahoma. In that case, Bookout and Schwartz v. Toyota, Jean Bookout and her passenger Barbara Schwartz were exiting I-69 in Oklahoma. The brakes on Bookout’s 2005 Camry didn’t slow the car, so she quickly threw the parking brake. The car crashed into an embankment, killing Schwartz and seriously injuring Bookout.
Toyota could not explain a 150-foot skid mark from the right tire. It indicated Bookout was attempting to brake and hadn’t mistaken the accelerator for the brake pedal.
The jury returned a $3 million verdict, finding Toyota acted with reckless disregard. This was after hearing from a variety of experts, including Philip Koopman, a Carnegie Mellon University professor of computer engineering and an expert in safety-critical embedded systems. Finding for the plaintiffs, the jury concluded that the engineering process and source code were more likely than not the root cause of the crash.
Before punitive damages could be addressed, the company quickly settled.
Coding rules, “spaghetti code” complexity, fault containment, and bugs
Koopman uses the Toyota issues as a case study in a lecture describing in depth some of the possible safety-critical failures. Some of these ideas may only be fully understood by experienced software engineers and coders, but here is a summary of a few of the issues:
- Safety Integrity Levels (SIL) – These classifications relate to the importance of a system for vehicle safety. A system like the throttle that makes a car difficult to control if it fails requires a higher level of testing than emissions controls.
- Fault containment – This concept limits shared resources so that a failure in one component cannot corrupt its backup; if one part fails, an independent backup still works.
- Coding complexity – “Spaghetti code” with many paths is harder to test and can become untestable if overly complex. Also, it is harder to fix a bug without causing another.
- Rules and safety culture – Standards exist across industries, but are not required in the automotive field (the US Department of Transportation can issue a recall, but this is reactive), so it is largely up to companies to develop their own safety culture.
The principle behind SIL is that it is not reasonable to allow the same error rate for a smartphone app as for a vehicle throttle control program. A process needs to be in place to ensure that a system as important as throttle function is not unreasonably dangerous. But apparently, Toyota did not have a software process testing model.
NASA conducted a review of Toyota practices. Fault containment practices were not always followed in designing paths for fail safes. The code for the throttle was also overly complex which affected the ability to run tests for catastrophic events. No peer review process was implemented to catch issues and amazingly there was no bug tracking/reporting system. Toyota had used its own coding rules, but didn’t always follow them.
Toyota has claimed that it would have caught and fixed any coding errors that caused UA. Its other argument was a blanket reliance on fail-safe measures. Safety needs to be a part of organizational culture. An internal memo uncovered in the course of litigation mentioned that the safety culture needed to improve. When catastrophic events are not taken seriously enough, a defect/bug can cause significant damage.
If you or a loved one was in an accident that just can’t be explained and seemed to involve unintended acceleration, speak with one of the attorneys at Weisfuse & Weisfuse LLP. Our products liability lawyers have the experience to handle these complex claims and always offer free consultations. Call 212-983-3000 today.