New dyads? The effect of social robots’ anthropomorphization on empathy towards human beings
Keywords: Anthropomorphism; Anthropomorphization; Dyadic completion; Human-technology interaction; Mind attribution; Social robot
DOI: 10.1016/j.chb.2023.107821
Publication Date: 2023-05-31
AUTHORS (3)
ABSTRACT
Research on Human-Technology Interaction has revealed that, under certain conditions, people instinctively interact with social robots in ways comparable to Human-Human Interaction. Indeed, people apply social perception schemas and attribute a mind to social robots, especially when the robots present anthropomorphic characteristics. Furthermore, under certain conditions, anthropomorphic social robots are granted moral consideration and participate in moral dyads. Thus, anthropomorphism facilitates social robots' integration into people's lives. However, it is still unknown whether adopting social schemas with social robots, in turn, affects how individuals perceive and interact with other people. To fill this gap, we experimentally investigated whether the type of mind attributed to an anthropomorphic social robot subsequently and complementarily influences empathy towards a person in trouble. Participants (n = 269) interacted (vs. did not interact) through a chatbot with a highly (vs. lowly) anthropomorphic social robot, evaluated it on mind dimensions and, finally, expressed their empathy towards a person. Results demonstrated that anthropomorphism fosters the attribution of agency (anthropomorphic appearance and interaction through the chatbot) and experience (anthropomorphic appearance only), which, in turn, significantly affected empathy towards the social target, but in opposite directions. Implications and future research directions are outlined.