Human and Programming Errors Caused Self-Driving Vehicle Fatality
One of the most difficult things about creating a safe self-driving vehicle is programming the vehicle to be as observant as a human driver.
Self-driving cars not only need GPS capabilities to get from point A to point B, but must also be able to sense and avoid surrounding hazards, such as other vehicles, pedestrians, and objects in the road.
Generally, humans can still do this far more easily than computers. That advantage disappears, however, when a driver is distracted, which is all too common today.
Safety Driver to Blame in Accident Involving Self-Driving Uber
Unfortunately, U.S. safety investigators found that both an inattentive human driver and faulty programming were involved in a fatal accident last year in which a self-driving Uber vehicle struck and killed a pedestrian in Tempe, Arizona.
A federal investigation found that the vehicle lacked programming to see and respond to pedestrians outside of crosswalks, in addition to other safety and design lapses. But the cause of the accident was attributed to a safety driver in the vehicle who was distracted by their phone.
The safety driver was sitting behind the wheel of the self-driving car and was supposed to step in if the software driving the vehicle failed. A video recording showed that the safety driver was looking away from the road during 34 percent of the trip and glanced away a total of 23 times in the three minutes leading up to the crash.
The National Transportation Safety Board found that if the driver had been paying attention at the time of the accident, they would have had two to four seconds to see and avoid the pedestrian, who was walking her bike across the road.
The NTSB also criticized the federal National Highway Traffic Safety Administration for not having more regulations in place for the testing of self-driving vehicles on U.S. roadways, CNN reported.
Who Is Liable for Accidents Involving Self-Driving Cars?
Currently, all self-driving vehicles on the roadways are equipped with safety drivers who are supposed to take over if the technology fails. The technology has not yet reached a point where it can be trusted on its own.
When it comes to criminal and civil liability for accidents involving self-driving vehicles, there is still a lot of gray area. Uber reached a civil settlement with the family of the woman killed in the Tempe accident, but the company was not criminally charged. The safety driver, on the other hand, could be charged with manslaughter. The local prosecutor's office is still deciding whether to press charges.
It is likely that different liability standards will be set for accidents involving vehicles driven by humans and those driven autonomously. It will certainly be an important area of law as self-driving vehicles become more common.
Related Resources:
- Self-Driving Car Accidents: Top 5 FAQs (FindLaw's Injured)
- 'Driver' of Uber's Self-Driving Car That Killed Pedestrian Was Watching Hulu During Accident (FindLaw's Injured)
- Who Is Liable in a Waymo Self-Driving Car Accident? (FindLaw's Injured)
- Fault and Liability for Car Accidents (FindLaw's Learn About the Law)