
Can You Sue A Self-Driving Car With No Driver?

On Sunday night, a self-driving car operated by Uber struck and killed a pedestrian, 49-year-old Elaine Herzberg, on North Mill Avenue in Tempe, Arizona. It appears to be the first time a self-driving car has killed a human being by force of impact. The car was traveling at 38 miles per hour.

An initial investigation by Tempe police indicated that the pedestrian might have been at fault. According to that report, Herzberg appears to have come “from the shadows,” stepping off the median into the roadway and ending up in the path of the car as she jaywalked across the street. The National Transportation Safety Board has also opened an investigation.

For now, it’s difficult to evaluate what this accident means for the future of autonomous cars. Crashes, injuries, and fatalities were a certainty as driverless vehicles began moving from experiment to reality.

Advocates of autonomy tend to cite overall improvements to road safety in a future of self-driving cars. Ninety-four percent of car crashes are caused by driver error, and both fully and partially autonomous cars could reduce that number substantially.

Even so, crashes, injuries, and fatalities will hardly disappear when and if self-driving cars are ubiquitous. Robocars will crash into one another occasionally and, as the incident in Tempe illustrates, they will collide with pedestrians and bicyclists, too. Eventually, though, those casualties will likely number far fewer than the 37,461 people who were killed in car crashes in America in 2016.

When people get into car crashes with one another, vehicular negligence is typically the cause. Determining which party is negligent, and therefore at fault, is central to the common understanding of automotive risk. Negligence means liability, and liability translates the human failing of a vehicle operator into financial compensation—or, in some cases, criminal consequence.

There is growing recognition that self-driving cars implicate the manufacturer of the vehicle more than its driver or operator. That has different implications for a company like GM, which manufactures and sells cars, than for Google, which has indicated that it doesn’t plan to make cars, only the technology that runs them. The legal scholar Bryant Walker Smith has argued that autonomous vehicles represent a shift from vehicular negligence to product liability.

On today’s roads, product liability claims arise in cases like the failures of Bridgestone/Firestone tires in the late 1990s and the recent violent ruptures of Takata airbags.

These situations represent fairly traditional examples of product liability: A company designed, manufactured, or marketed a product that didn’t do what it promised, and harmed people as a result.

The death of a pedestrian struck by a self-driving Uber in Tempe shows that the legal implications of autonomous cars are as important as the technology, if not more so. Read more about the possible legal action that can be taken in The Atlantic article, “Can You Sue a Robocar?”

 
