The Morals Of Driverless Cars

Driverless cars are no longer the stuff of sci-fi films. They are out there now, being tested on our roads, and may be in commercial use by 2021. Proponents argue that these vehicles could improve road safety, ease congestion and improve fuel efficiency. But their use could also bring a whole set of risks and ethical issues that we are only just starting to understand.

A huge and fascinating experiment has recently been undertaken to gauge worldwide opinion on such decisions. More than 2.3 million people gave their views – what would yours be?

For example, faced with a pedestrian stepping into the road, we hope we would slam on the brakes or swerve to avoid them. The driverless car would need to be programmed to do the same. But what if slamming on the brakes injured the passenger, or swerving meant hitting a bus queue? Whilst such dilemmas will hopefully be rare, how should the driverless car be programmed to make these ‘moral’ decisions about who should potentially be saved or sacrificed?

The survey, called the Moral Machine, asked respondents to choose who to save or sacrifice in 13 different scenarios involving a mix of potential victims: young or old, rich or poor, businessman or jaywalker, individuals or groups, and even animals. Some results were predictable – most people spared humans over pets, for example, and groups of people over individuals. Others were more interesting and varied by country or region. North America and several European nations showed a stronger preference for sacrificing older lives to save younger ones than did countries such as Japan, Indonesia and Pakistan, which perhaps have greater respect for their elders. And countries with greater economic inequality were more likely to sacrifice the homeless person on one side of the road than the executive on the other.

Not only does this highlight the new field of machine ethics, but it also raises some questions about our own moral judgements!

How important is this information? It seems likely that self-driving cars will, proportionally, cause fewer accidents than human drivers do each year. After all, robots don’t engage in drink-driving, drug-driving or drowsy-driving as some of us do; nor do they check their phones or send texts at the wheel. They may also have better driving skills and sharper reflexes than many of the car, truck and van drivers we see on a daily basis. However, any crash involving a driverless car is likely to receive intense scrutiny, and the programmed responses will need to be acceptable to the population who may be buying these vehicles in the not-too-distant future.

It certainly makes you think… but will it be the computer doing all the thinking in the future?!



Awad, E., Dsouza, S., Kim, R., et al. The Moral Machine experiment. Nature, 2018.