Self-driving cars are becoming a regular part of daily life. Fully self-driving cars are probably still years away, but some cars can already steer, park, accelerate, brake, and even change lanes on their own. As with any new technology, bugs become more apparent as these cars move from the test track to the real world. So when a self-driving car crashes, who is responsible – the human driver or the company that programmed the car’s software?
Under California law (Vehicle Code Section 38750), an autonomous vehicle must have a safety alert system that warns the driver in the event of an emergency, and the driver must be able to take full control of the vehicle, including the brake, accelerator, and steering wheel. If the alert system fails, the crash may be the car’s fault; but if the driver fails to take control in an emergency, the crash is the driver’s fault, and the fact that the car was set to “auto-pilot” is not a defense.
As of February 1, 2022, the California DMV had received 405 reports of autonomous vehicle collisions. As with all new and emerging technology, it will take time for courts to decide how to apportion blame in self-driving car crashes, but a few cases already provide some guidance.
Self-driving car crashes have resulted in a number of civil claims and criminal charges throughout California. In Northern California, two counts of vehicular manslaughter were filed against the driver of a Tesla Model S that, while on Autopilot, ran a red light and struck another car, killing two people. The victims’ families also filed civil lawsuits against the driver and Tesla, accusing the company of selling vehicles with unreasonably dangerous emergency automatic braking systems.
In a civil case from Arizona, a pedestrian was killed by an Uber self-driving car. The investigation drew on data from the car’s computer system and dash cam video from inside the vehicle, which showed the driver repeatedly looking down at her lap in the final minutes before the crash, including for five full seconds before impact. Records from the driver’s Hulu account showed that she was streaming the television show The Voice just before the crash. The electronic data indicated that the death most likely could have been avoided had the driver been paying attention. Lawsuits arising out of self-driving car crashes will rely heavily on electronic data and video downloaded from the car’s computer as well as from other devices such as dash cams and cell phones.
One thing is clear: self-driving technology is not a replacement for an alert human driver behind the wheel, and it does not relieve the driver of the responsibility to take control of the vehicle in an emergency.