They're scenarios no one ever wishes to find themselves in, but one new video game is forcing players to face their worst driving fears. A recent simulation game makes players choose one life over another to prove a point: programming autonomous cars won't be as easy as we think.
The simulation comes from creative technologist Matthieu Cherubini. In each scenario, players choose one of three philosophies, modeled on the ethical behaviors with which autonomous cars could one day be programmed.
First, there's preservationist behavior. Preservationist programming ensures that everyone inside the car remains the top priority in the event of an accident. The second is humanist programming, which quantifies the number of lives saved in various scenarios and tries to save the most possible -- even if it's to the detriment of the car's driver. The last (and arguably the most controversial) method of programming is profit-driven: the car attempts to make the lowest-cost decision in terms of insurance liability and damage.
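The three modes above could be sketched as a toy decision function. Everything here -- the names, the `Outcome` fields, the scoring rules -- is an illustrative assumption, not Cherubini's actual simulation logic:

```python
# Toy sketch of the three ethical modes described above.
# All names, fields, and scoring rules are illustrative assumptions,
# not the simulation's actual logic.
from dataclasses import dataclass

@dataclass
class Outcome:
    occupant_deaths: int   # people inside the car harmed
    bystander_deaths: int  # people outside the car harmed
    cost: float            # estimated insurance/damage cost

def choose(outcomes, mode):
    """Pick the outcome a given ethical mode would prefer."""
    if mode == "preservationist":
        # Protect the car's occupants above all else.
        key = lambda o: (o.occupant_deaths, o.bystander_deaths)
    elif mode == "humanist":
        # Minimize total deaths, even at the driver's expense.
        key = lambda o: o.occupant_deaths + o.bystander_deaths
    elif mode == "profit-driven":
        # Minimize estimated insurance and damage cost.
        key = lambda o: o.cost
    else:
        raise ValueError(f"unknown mode: {mode}")
    return min(outcomes, key=key)

# A crash with two possible maneuvers:
swerve = Outcome(occupant_deaths=1, bystander_deaths=0, cost=900_000)
brake  = Outcome(occupant_deaths=0, bystander_deaths=2, cost=150_000)

print(choose([swerve, brake], "preservationist"))  # favors brake
print(choose([swerve, brake], "humanist"))         # favors swerve
```

Note how the same crash produces different "right" answers depending only on which mode is active -- the point the game is built around.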
The game reminds players that ethical decision making is inherent in problem-solving algorithms and, as our reliance on technology increases, the stakes will rise.
Cherubini noted that autonomous cars won't necessarily be programmed to uphold the same 'values'; those values may vary across automakers and cultures.
"If a car is manufactured in Germany and works well in German context and culture, and is exported to China, and I think it’s not possible that this car that works in a German context will work in a Chinese context," he said.
"The ethics don’t adapt from one culture to the next."
The problems presented by the video game aren't novel either. This style of decision-making quandary stems from what's known as the trolley problem. In the original scenario, a runaway trolley is speeding toward five railway workers, and you have no way to warn them. You see a lever that will switch the tracks, but the problem is that one worker is on the alternate route. It's still one death compared to five; however, you'd be the one pulling the lever and ultimately sending one man to his death.
The game recreates variations of this classic problem, and players must confront the limits of their own morality. At what point do they sacrifice themselves and their wellbeing for strangers? How do they calculate a net gain when death is certain?
"It doesn't decide what to do -- it does something random," Cherubini said. "That's a bit how we do it now. We don't think we're going to hit that person or that one -- we panic. Then you don't put value on people, that this person would be better [to harm] than this other person."
These questions don't stop with the hypothetical future. Automakers have already hinted at their answers. John Hanson of the Toyota Research Institute spoke about the struggles automakers will face as these technologies advance.
"What if we can build a car that's 10 times as safe, which means 3,500 people die on the roads each year. Would we accept that?" said Hanson, who is currently helping develop Toyota's self-driving technology, in a February interview. "A lot of people say, 'If I could save one life, it would be worth it.' But in a practical manner, though, we don't think that would be acceptable."