
Who Is to Blame If a Self-Driving Car Crashes?

Controversial Uber and Tesla Accidents Inspire Questions

On March 18, 2018, a self-driving Uber struck and killed an Arizona woman who was walking her bike across the street in the first recorded pedestrian fatality involving an autonomous vehicle. Similarly, Tesla has a “worrisome record” of its vehicles being in fatal accidents while its controversial Autopilot feature is engaged.

Across the board, people seem to be wondering: who is responsible when a self-driving car crashes?

So far, investigations have attributed liability to both the human being in the driver’s seat and the company that created the self-driving technology.

The Uber Investigation

In a National Transportation Safety Board (NTSB) investigation, the safety driver in the self-driving Uber was cited as the primary cause of the crash we described above. Nevertheless, the individual behind the wheel shared liability with Uber. In fact, the company settled with the victim outside of court.

Although the safety driver was reportedly streaming an episode of The Voice on her phone at the time of the collision, Uber lacked an adequate safety plan and had failed to investigate previous incidents involving its self-driving fleet. Ultimately, a deadly combination of driver error and corporate negligence ended a woman’s life. Uber paused its autonomous vehicle testing for about nine months before resuming, and the company currently operates a self-driving fleet in Pennsylvania.

The Problem With Tesla

According to The New York Times, the NTSB also found “Tesla’s Autopilot driver-assistance system and a driver who relied too heavily on it” to blame for a 2018 accident in Mountain View, California. For years, Tesla has been overpromising its Autopilot system and allegedly misleading consumers, who trust the car to more or less drive by itself. A screenshot of Tesla’s website shared by Vox helps illustrate this point:

In reality, and in contrast with the marketing statement above, Tesla’s Autopilot is only a Level 2 “partial automation” system on the six-level scale (Levels 0–5) of driving automation defined by SAE International.

In the Mountain View accident, the driver had complained about issues with the Autopilot system before the fatal crash. Unfortunately, crashes like this are becoming a pattern for the company. An Associated Press article from January 2020 states:

“Three crashes involving Teslas that killed three people have increased scrutiny of the company’s Autopilot driving system.”

Our Thoughts

Although sources like The Atlantic quip that you can’t “sue a robocar,” the companies behind self-driving technology may still face liability when that technology fails.

Even with driver negligence, product liability seems to play a large role in this emerging legal landscape.

Fortunately, our team at MR Civil Justice has experience in both traditional car accident cases and the field of product liability.

If you or someone you love has been involved in any kind of autonomous vehicle accident, please contact us at (214) 307-8387 or online to discuss your case.

We offer free consultations for our personal injury clients and would be honored to help you navigate this complex area of the law.
