Help Make the Difficult Moral Decisions to Improve Self-Driving Cars

A program called the Moral Machine is collecting data on how humans decide what to do in tough moral circumstances, and you can play too
Trevor English
Moral Machine

As self-driving cars near mass production, many are wondering how a machine could possibly make difficult moral decisions when presented with a dangerous situation. In the event of an imminent crash, for example, a driverless car will have to decide whether to steer, brake, or accelerate, and what to hit. This becomes a moral issue whenever human or animal life is at stake.

A new program called the Moral Machine is collecting data on how humans would decide what to do in tough moral circumstances, and you can play too.


Autonomous vehicles really will become modern-day moral decision-makers. While many feel that a machine should not be making choices about human lives, that time is fast approaching.

In the Moral Machine game, you essentially have to choose who will die in a given scenario. In many scenarios, every possible outcome is bad, and ultimately a car will have to be programmed to make this kind of choice.

When you finish the short test, the game shows you whom you preferred to save and which values mattered most to you. It's a little unsettling to be presented with the cumulative result of a series of difficult moral choices.

You can try out the game here.
