Tetris reveals how people respond to unfair AI

A Cornell University-led experiment in which two people played a modified version of Tetris revealed that players who got fewer turns perceived the other player as less likable, regardless of whether a person or an algorithm allocated the turns.

Most studies of algorithmic fairness focus on the algorithm or the decision itself, but the researchers sought to explore the relationships among the people affected by those decisions.


“We are starting to see a lot of situations in which AI makes decisions on how resources should be distributed among people,” said Malte Jung, associate professor of information science, whose group conducted the study. “We want to understand how that influences the way people perceive one another and behave towards each other. We see more and more evidence that machines mess with the way we interact with each other.”

In an earlier study, a robot chose which of two people to give a block to, and the researchers studied each person's reaction to the machine's allocation decisions.

“We noticed that every time the robot seemed to prefer one person, the other one got upset,” said Jung. “We wanted to study this further, because we thought that, as machines making decisions becomes more a part of the world – whether it be a robot or an algorithm – how does that make a person feel?”
