How is human behaviour impacted by an unfair AI? A game of Tetris reveals all

A team of researchers puts a spin on Tetris and observes how people behave as they play the game.
Sejal Sharma
Tetris unfair line piece

alengo/iStock

We live in a world run by machines. They make important decisions for us: whom to hire, who gets approved for a loan, what content we see on social media. With artificial intelligence (AI) making inroads into our lives in new ways, machines and computer programs have more influence over us than ever. And that influence extends far beyond the person directly interacting with the machine.

While there is literature on how decisions made by machines affect individuals, there is little evidence of how machine behavior affects interpersonal relationships: how we interact with and perceive other people.

A team of researchers from Cornell University wanted to understand the interpersonal consequences of decisions made by machines. In their study, they introduce the notion of Machine Allocation Behavior: the overt behavior resulting from a machine's decisions about allocating something of value among people.

The social consequences of Machine Allocation Behavior

“We are starting to see a lot of situations in which AI makes decisions on how resources should be distributed among people,” said Malte Jung, associate professor of information science, whose group conducted the study, in a statement. “We want to understand how that influences the way people perceive one another and behave towards each other. We see more and more evidence that machines mess with the way we interact with each other.”

The researchers tweaked Tetris, the addictive puzzle game in which players stack falling blocks to clear lines, into a two-player version. Instead of one player, as in the original game, two players would work together to complete the same board.

The team then introduced an allocator whose job was to decide which player would take each turn. The allocator was presented as either a human or a machine (an AI). The experiment was designed so that a player would receive either 90 percent, 50 percent, or 10 percent of the turns.

The researchers found that players who received fewer turns were well aware that the other player had been given more. Interestingly, whether the allocator was a human or an AI made no difference to how the players felt about the allocation.

The researchers also found that fairness didn't automatically lead to better gameplay and performance. In fact, equal allocation of turns led, on average, to a worse score than unequal allocation.

By framing these effects as Machine Allocation Behavior, the researchers believe their work reveals a new perspective on algorithmic fairness and on how a machine's decisions shape people's interpersonal relationships.

Study abstract:

Machines increasingly decide over the allocation of resources or tasks among people resulting in what we call Machine Allocation Behavior. People respond strongly to how other people or machines allocate resources. However, the implications for human relationships of algorithmic allocations of, for example, tasks among crowd workers, annual bonuses among employees, or a robot’s gaze among members of a group entering a store remains unclear. We leverage a novel research paradigm to study the impact of machine allocation behavior on fairness perceptions, interpersonal perceptions, and individual performance. In a 2 × 3 between-subject design that manipulates how the allocation agent is presented (human vs. artificial intelligent [AI] system) and the allocation type (receiving less vs. equal vs. more resources), we find that group members who receive more resources perceive their counterpart as less dominant when the allocation originates from an AI as opposed to a human. Our findings have implications on our understanding of the impact of machine allocation behavior on interpersonal dynamics and on the way in which we understand human responses towards this type of machine behavior.
