Can a Self-Driving Car Intentionally Kill You?

What if you knew that a manufacturer was putting a product on the market that not only could kill you, but was specifically programmed to do so, and that selling this kind of product would likely be perfectly legal? Well, that time may be right around the corner, as self-driving cars start dominating our roads. 

Liability for Self-Driving Car Accidents

We know that there can be liability for negligence when a self-driving car malfunctions. Operating a vehicle safely without a driver requires millions of split-second calculations, and one glitch in a program can lead to catastrophe. So, why would a car be programmed to actually crash and potentially injure its occupants? The answer lies in the moral dilemma that programmers now face.

The Moral Dilemma

Imagine that a car with two passengers inside recognizes 10 pedestrians in the middle of the street. Does the car run over the pedestrians, or does it immediately veer off the road, risking injury to its own occupants? Does the car veer to protect the 10 pedestrians if doing so means hitting an outdoor café, where it could injure 20 people?

These are the issues that self-driving car manufacturers are facing. Many people say they would prefer that a car choose to injure the fewest people. But that answer may change if the option entails injuring the car's own occupants.

Laws May Provide Some Protections

It is likely that laws will be passed absolving a manufacturer of liability for damages that arise from "decisions" made by a car that save, or try to save, the maximum number of people. But that still will not solve all the problems.

In many cases, pedestrians are responsible for an accident because they jump into the road, ignore crosswalks, or even walk the streets while intoxicated. If a car must choose between four pedestrians who are illegally running into traffic and the one occupant of that car, the numbers alone say it should save the pedestrians. But legally, the pedestrians would be responsible for causing the situation. In that case, the car's decision to simply save more people would not necessarily be legally or morally correct.

What about pedestrians who could avoid an accident on their own? Does a car veer off the road, injuring its passengers, when the pedestrians it was trying to avoid could have gotten out of the way?

How Will a Car Make Its Decisions?

The legal liability issues will likely rest upon how a car makes its decisions. Much litigation will be dedicated to the programming of these cars, and the decisions that the manufacturer “trains” the computers to make. Until a car can be programmed to weigh facts and information to determine legal liability, manufacturers may still be liable to car accident victims for injuries that self-driving cars cause.

If you are injured in a car accident or by a defective product, contact the attorneys of Brassel, Alexander & Rice, LLC today for a free consultation to discuss your injury or liability case.