Eyes are the Windows to the Robot’s Soul: How mutual gaze with a robot impacts human social decision-making

Image courtesy of IIT.

Movies like Her, The Iron Giant, and Chappie reflect a growing cultural interest in the social relationship between robots and humans. Researchers affiliated with the Istituto Italiano di Tecnologia (IIT) sought to understand this phenomenon in a study of how social interaction with a robot affects human decision-making in a social context.

“What we try to do is to actually understand those parameters [of social interaction]. What is it that makes an interaction smooth and natural and pleasant?” said Agnieszka Wykowska, a professor of engineering psychology who leads IIT’s Social Cognition in Human-Robot Interaction unit. 

The team—consisting of Wykowska, Marwen Belkaid, Kyveli Kompatsiari, Davide De Tommaso, and Ingrid Zablith—invited participants to play a game where both they and a humanoid robot controlled cars that were heading towards each other. After looking into the eyes of the robot, the participant decided whether to divert from the path or keep moving forward at the risk of colliding with the other car. In each trial, players were scored depending on the outcome; one car moving straight while the other diverted resulted in the highest possible payoff for the former, and both cars moving straight and colliding with each other resulted in the highest possible loss for both players. One group of participants received reciprocated eye contact from the robot in most of the trials, while the other group received it on fewer occasions.

After making eye contact with the robot, participants were slower to decide whether to divert the car or keep it moving straight than in trials without eye contact. Further analyses of participants’ brain activity revealed higher neural synchronization during trials in which the robot reciprocated eye contact. “This high neural synchronization is related to higher cognitive effort to separate irrelevant signals. Here, [that signal] was eye contact with the robot,” Kompatsiari said. Mutual gaze also shaped how the participants played the game: the group that received less eye contact from the robot adopted a more self-oriented strategy. 

Kompatsiari and Wykowska hope studies like this can advance the field of social robotics. “There’s this trend in social robotics that we should make the robots be as social as possible,” Wykowska said. “[But] I think it’s quite striking to see that the social is not necessarily always beneficial. So it can be that you have tasks where you need to focus on something, and then having a social robot right next to you might not necessarily help.” 

On the other hand, their work provides insight into how robots can be used as proxies for human-human interaction. “If humanoid robots will be around, they might automatically trigger mechanisms in our brain to treat them as social agents, and we might start having social attitudes towards those robots,” Wykowska said. 

While there is still a long way to go before robots become fully integrated into human life, they have already made an impact in fields such as clinical psychology. Kompatsiari noted, for example, the ways that some robots can interact with patients with autism. “[These interactions] can induce the robot to assist the therapist to train specific mechanisms of social cognition,” Kompatsiari said.