To Err is Human(-like): Effects of Robot Gesture on Perceived Anthropomorphism and Likability
Previous work has shown that non-verbal behaviors affect anthropomorphic inferences about artificial communicators such as virtual agents or social robots. In an experiment with a humanoid robot, we investigated the effects of the robot's hand and arm gestures on perceived humanlikeness, likability of the robot, shared reality, and future contact intentions after interacting with the robot. For this purpose, the speech-accompanying non-verbal behaviors of the humanoid robot were manipulated in three experimental conditions: (1) no gesture, (2) congruent co-verbal gesture, and (3) incongruent co-verbal gesture. We hypothesized higher ratings on all dependent measures in the two multimodal (i.e., speech and gesture) conditions compared to the unimodal (i.e., speech only) condition. The results confirm our predictions: when the robot used co-verbal gestures during interaction, participants anthropomorphized it more, perceived it as more likable, reported greater shared reality with it, and expressed stronger future contact intentions than when the robot gave instructions without gestures. Surprisingly, this effect was particularly pronounced when the robot's gestures were partly incongruent with its speech, even though this behavior impaired participants' task-related performance. These findings show that communicative non-verbal behaviors displayed by robotic systems affect anthropomorphic perceptions and the mental models humans form of a humanoid robot during interaction.
Keywords: Social human-robot interaction · Multimodal interaction and conversational skills · Non-verbal cues and expressiveness · Anthropomorphism · Robot companions and social robots
The work described was supported by the Honda Research Institute Europe and the Center of Excellence 'Cognitive Interaction Technology' (CITEC).