
Do Self-Driving Cars Have Morals?

Driverless cars offer a great opportunity for advances in transportation, but they are also an added impetus for secure software.

With the rise of self-driving cars has come the grim reality that a car may have to choose between saving one life and ending another. Unlike a reckless human driver, an autonomous vehicle cannot itself be held legally or morally responsible for an accident. That responsibility falls to the scientists and programmers who predetermine the machine's course of action in a given road situation. To inform these judgments, a team led by Björn Meder at the Max Planck Institute for Human Development conducted a simulation study to ascertain which actions by a self-driving car in risky situations people find acceptable and which they find morally reprehensible.


In each situation presented during the simulation, the subject had to choose between staying in the lane, at the risk of hitting a pedestrian, and swerving off the road, at the risk of injuring a bystander. Each pedestrian and bystander was also assigned a risk level, ranging from an unknown risk up to an 80% chance of collision. Subjects worked through many different combinations of risks to pedestrians and bystanders, and the researchers recorded the actions they took. The team hopes that, with this study and others like it, machine behavior can eventually be standardized and programmed to reflect innate human morality. While this is no trivial task, it is necessary to ensure as much safety as possible for passengers, pedestrians, and bystanders.
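
To make the trade-off concrete, here is a minimal, hypothetical sketch of how a single trial of this kind could be encoded in software. It is not the researchers' actual model: the expected-harm comparison, the worst-case assumption for an unknown risk, and all names and values are illustrative assumptions only.

```python
# Hypothetical sketch of one dilemma trial: stay in the lane and risk
# hitting a pedestrian, or swerve and risk injuring a bystander.
# The decision rule below (compare collision probabilities) is an
# illustrative assumption, not the study's model.

from typing import Optional

def choose_action(p_hit_pedestrian: float,
                  p_hit_bystander: Optional[float]) -> str:
    """Pick the lower-risk option for one simulated trial.

    p_hit_pedestrian: probability of a collision if the car stays in lane.
    p_hit_bystander:  probability of a collision if the car swerves,
                      or None when the risk to the bystander is unknown.
    """
    if p_hit_bystander is None:
        # Unknown risk: assume the worst case in the study's stated range (80%).
        p_hit_bystander = 0.8
    return "stay" if p_hit_pedestrian <= p_hit_bystander else "swerve"

# A few example risk combinations of the kind subjects were shown.
for stay_risk, swerve_risk in [(0.2, 0.8), (0.8, 0.2), (0.5, None)]:
    print(f"stay risk {stay_risk}, swerve risk {swerve_risk} -> "
          f"{choose_action(stay_risk, swerve_risk)}")
```

A purely probability-minimizing rule like this is exactly what such studies probe: human subjects do not always pick the mathematically lower risk, which is why the researchers recorded their choices rather than assuming them.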