This post was contributed by a community member. The views expressed here are the author's own.


Did You Know That a Driverless Car Isn’t Necessarily Protecting Its Passengers?

Imagine you are the chief software engineer at an innovative technology company. You are leading the programming and development of autonomous vehicles and will soon give the official safety approval for the company’s first self-driving car. Before the vehicle can be released, the final code must determine how the vehicle responds when faced with an imminent multi-vehicle accident in which there is a high probability of multiple fatalities. You have the following two choices:

  1. Do you program the car to always protect its passengers, regardless of any potentially great loss of human life in other vehicles or among pedestrians?
  2. Do you program the car to always minimize the total amount of human harm and fatality, even if that could mean “sacrificing” passengers in its car to save the lives of many people?

What do you do?

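For readers curious what these two policies might look like as code, here is a minimal sketch in Python. The function names, the harm scores, and the structure of each maneuver are all invented for illustration; they come from no real autonomous-vehicle system:

```python
# Hypothetical sketch of the two policies described above.
# The harm scores and maneuver names are invented for illustration,
# not drawn from any real autonomous-vehicle codebase.

def protect_passengers(options):
    """Choice 1: pick the maneuver safest for the car's own passengers."""
    return min(options, key=lambda o: o["passenger_harm"])

def minimize_total_harm(options):
    """Choice 2: pick the maneuver with the least total human harm."""
    return min(options, key=lambda o: o["passenger_harm"] + o["other_harm"])

# Example: staying on course is safest for the passengers, but
# swerving spares more people overall.
options = [
    {"maneuver": "stay_course", "passenger_harm": 1, "other_harm": 5},
    {"maneuver": "swerve",      "passenger_harm": 4, "other_harm": 0},
]

print(protect_passengers(options)["maneuver"])   # stay_course
print(minimize_total_harm(options)["maneuver"])  # swerve
```

The same situation produces two different decisions depending on which policy was programmed in, which is exactly the dilemma posed above.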

I am currently taking an online course on technology and ethics, taught by Professor Robert Bailey and offered by the Ohio State University on the Coursera platform, that considers the above question. Dr. Bailey is an Emeritus Professor of Mechanical and Nuclear Engineering who has two degrees in chemical engineering from the University of Illinois and a Ph.D. in nuclear engineering from Purdue. In the past, Dr. Bailey worked at Argonne National Laboratory and Atomics International on power reactor design and operation, and was a consultant and educator on nuclear weapons effects for the United States Department of Defense. The intersection of technology and ethics is a fascinating area because you begin to realize that many of the assumptions people have lived with for ages are no longer necessarily applicable.

Given that I drive on roads that already have driverless cars operating on them, I am stunned to realize that I hadn’t even thought of the driverless car scenario described above. If the car were being driven by a human being, he or she would instinctively opt for option 1, because in that split-second decision the self-preservation instinct would override anything else. At least in this situation, the interests of the driver and the passengers are one and the same, and until now that has been the only option for basically all modes of transportation, be it cars, boats, or airplanes. But with the advent of driverless cars, we have a new technological, and really ethical, option available: for the car to pick the best outcome for society at large, and not necessarily for me, the passenger.

Do the states that have approved driverless cars know how those cars have been programmed to respond? If they have been programmed to respond as per choice number 2, do they realize all the implications of that decision? Although from a state’s perspective it might even appear to be the correct choice, the fact is that from the beginning of locomotion, humans have presumed not only that they would act to defend themselves but that the other party would do the same, and they factored those presumptions into their split-second decisions. Now, all of a sudden, a human can no longer assume that a driverless car will make a decision in its own favor. Considering that these will be split-second decisions, I can foresee a lot of problems with any programming of self-driving vehicles that does not make them do what the other party would expect.

For more reading on this subject, see the articles below, but at this point in time I would agree with Consumer Watchdog and recommend that government agencies proceed with a lot more caution than they appear to be doing. (http://www.latimes.com/business/autos/la-fi-hy-consumer-watchdog-warns-dmv-on-googles-driverless-car-20140610-story.html) In light of these issues, I am not sure I am happy to know that I am currently sharing the road with driverless cars. What about you? How do you think a driverless car should be programmed to react? Do you want to share the road with one? Ride in one?


Additional reading:
