Putting the social back in social robotics

Much research on the interaction of humans and robots focuses on dyadic interaction, or at best a few humans with one robot. Below is a photo from the FRACTOS study, in which a robot and a virtual tutor form a triad with a child in a learning task on fractions.

Sooraj Krishna, Catherine Pelachaud, and Arvid Kappas. 2020. FRACTOS: Learning to be a Better Learner by Building Fractions. In Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’20). Association for Computing Machinery, New York, NY, USA, 314–316. DOI: https://doi.org/10.1145/3371382.3378318

In social psychology, much research deals with how groups interact with one another: the "us versus them" dynamic that is at the heart of much conflict. This is not yet a major topic in social robotics research and theory, but that is of course a function of the small number of robots in the wild. In a science fiction movie it is easy to have hordes of robots populate a busy street scene, as in the classic I, Robot. But which research group possesses enough robots to create real groups? Who has interacted with large numbers of robots in their workplace? It is only a question of time, however, before robots cease to be novelties and the intergroup psychology of a mixed human/robot society becomes a real concern. I am convinced that in this decade it will become important to anticipate how robots might become part of human groups, and how groups of humans and robots will interact in collaboration or competition.

Some of these issues are dealt with in a recent paper with Eric Vanman.


Society’s increasing reliance on robots in everyday life provides exciting opportunities for social psychologists to work with engineers in the nascent field of social robotics. In contrast to industrial robots that, for example, may be used on an assembly line, social robots are designed specifically to interact with humans and/or other robots. People tend to perceive social robots as autonomous and capable of having a mind. As such, they are also more likely to be subject to social categorization by humans. As social robots become more human-like, people may also feel greater empathy for them and treat robots more like (human) ingroup members. On the other hand, as they become more human-like, robots also challenge our human distinctiveness, threaten our identity, and elicit suspicion about their ability to deceive us with their human-like qualities. We review relevant research to explore this apparent paradox, particularly from an intergroup relations perspective. We discuss these findings and propose three research questions that we believe social psychologists are ideally suited to address.

Vanman, EJ, Kappas, A. “Danger, Will Robinson!” The challenges of social robots for intergroup relations. Soc Personal Psychol Compass. 2019; 13:e12489. https://doi.org/10.1111/spc3.12489