YOU’RE driving up to an intersection, and suddenly a brick wall appears in front of you. You can either hit the wall and kill both yourself and your passenger, or swerve to the left of it and hit several pedestrians who are crossing the road, killing them.
What would you do?
It’s moral questions like these that are confounding carmakers as they rush headlong into a future where the self-driving car will have to make choices just as humans do every time they jump behind the wheel.
Researchers are all over it, and one institution in particular – the Massachusetts Institute of Technology – wants to know what you would do if you were the car.
It has set up a website, called Moral Machine, that lets people decide which outcome they think is the lesser evil, choosing who will live or die from a mix of bank robbers, doctors, pregnant women, joggers, male and female business executives, babies, elderly or unfit people, and even cats and dogs.
The scenarios range from driverless autonomous cars right up to cars filled with children and pets. There’s even a choice between saving the lives of a car full of family pets and a baby in a stroller.
“Recent scientific studies on machine ethics have raised awareness about the topic in the media and public discourse,” MIT says.
“This website aims to take the discussion further, by providing a platform for ... building a crowd-sourced picture of human opinion on how machines should make decisions when faced with moral dilemmas, and ... crowd-sourcing assembly and discussion of potential scenarios of moral consequence.”
So far, the results show that we’re most interested in saving the highest number of lives, particularly if they’re women, babies or doctors, but the family pet sits high on the list of potential victims we’re willing to mow down in an emergency.
Research into how self-driving cars will interact with people, and even other cars, on our roads is still in its infancy, and it raises many more questions than simply who will live or die.
In fact, researchers have even drawn parallels between self-driving cars and robotic weapons systems designed to wipe out enemy forces. One of the key findings was that, just like an armed drone, carmakers would need to build aggression – a very human trait – into self-driving vehicles.
Carmakers will also have to deal with how autonomous vehicles handle other very human characteristics, such as making eye contact with another driver in a stand-off, or giving a wave if another vehicle makes room for a lane change – both a significant part of everyday commuting.
Volvo hopes its moves into autonomous car technology will help it live up to a pledge to build zero-fatality cars by 2020.