Should ethics or human intuition drive the moral judgments of driverless cars?

Frontiers in Behavioral Neuroscience: Human intuition is sometimes at odds with ethically acceptable behavior and political guidelines for autonomous self-driving cars

If autonomous vehicles abide by current ethical guidelines, people may not be happy with the decisions their cars make for them. Image: Shutterstock


— By Tania FitzGeorge-Balfour

When faced with driving dilemmas, people show a high willingness to sacrifice themselves for others, make decisions based on the victim’s age and swerve onto sidewalks to minimize the number of lives lost, reveals new research published in Frontiers in Behavioral Neuroscience. This is at odds with ethical guidelines in these circumstances, which often dictate that no life should be valued over another. This research hopes to initiate discussions about the way self-driving vehicles should be programmed to deal with situations that endanger human life.

“The technological advancement and adoption of autonomous vehicles is moving quickly, but the social and ethical discussions about their behavior are lagging behind,” says lead author Lasse T. Bergmann, who completed this study with a team at the Institute of Cognitive Science, University of Osnabrück, Germany. “The behavior that will be considered right in such situations depends on which factors are considered to be both morally relevant and socially acceptable.”

Traffic accidents are a major source of death and injury around the world. As the technology improves, automated vehicles will outperform their human counterparts, saving lives by eliminating accidents caused by human error. Even so, there will still be circumstances in which self-driving vehicles must make decisions in morally challenging situations. For example, a car may swerve to avoid hitting a child who has run into the road but, in doing so, endanger other lives. How should it be programmed to behave?

An ethics commission initiated by the German Ministry for Transportation has created a set of guidelines representing its members’ best judgment on a variety of issues concerning self-driving cars. These expert judgments may, however, not reflect human intuition.




Bergmann and colleagues designed a virtual reality experiment to examine human intuition in a variety of possible driving scenarios. Different sets of tests were created to highlight different factors that may or may not be perceived as morally relevant.

Based on a traditional ethical thought experiment, the trolley problem, test subjects could choose between two lanes on which their vehicle drove at constant speed. They were presented with a morally challenging driving dilemma, such as an option to move lanes to minimize lives lost, a choice between victims of different age, or a possibility for self-sacrifice to save others.

The experiment revealed that human intuition was often at odds with these ethical guidelines.

Bergmann explains, “The German ethics commission proposes that a passenger in the vehicle may not be sacrificed to save more people; an intuition not generally shared by subjects in our experiment. We also find that people chose to save more lives, even if this involved swerving onto the sidewalk, endangering people uninvolved in the traffic incident. Furthermore, subjects considered the factor of age, for example, choosing to save children over the elderly.”

He continues, “If autonomous vehicles abide by the guidelines dictated by the ethics commission, our experimental evidence suggests that people would not be happy with the decisions their cars make for them.”

Professor Gordon Pipa, a co-author also based at the University of Osnabrück, adds, “It is urgent that we start engaging in a societal discussion to define the goals and constraints of future rules that apply to self-driving vehicles. This needs to happen before they become an integral part of our daily lives.”

Bergmann explains that further research is needed. “While ‘dilemma’ situations deserve more study, other questions should also be discussed. Driving requires an intricate weighing of risks versus rewards, for example speed versus the danger of a critical situation unfolding. Decision-making processes that precede or avoid a critical situation should also be investigated.”


Original article: Autonomous Vehicles Require Socio-Political Acceptance—An Empirical and Philosophical Perspective on the Problem of Moral Decision Making

REPUBLISHING GUIDELINES: Open access and sharing research is part of Frontiers’ mission. Unless otherwise noted, you can republish articles posted in the Frontiers news blog — as long as you include a link back to the original research. Selling the articles is not allowed.

2 Comments on Should ethics or human intuition drive the moral judgments of driverless cars?

  1. Michael Zeldich // May 23, 2018 at 6:19 pm

    That is a move in the wrong direction. The rules are the guide for decisions, not moral considerations.


  2. Michael Zeldich // May 23, 2018 at 6:33 pm

    AI can’t be the basis for producing controls for SDVs because of the scalability problem. Driving a vehicle requires at least a human level of intelligence, which could belong only to subjective systems, natural or artificial.

