Abstract
Previous work has shown that non-verbal behaviors affect anthropomorphic inferences about artificial communicators such as virtual agents or social robots. In an experiment with a humanoid robot, we investigated the effects of the robot’s hand and arm gestures on perceived humanlikeness, likability of the robot, shared reality, and future contact intentions after interacting with the robot. For this purpose, the speech-accompanying non-verbal behaviors of the humanoid robot were manipulated in three experimental conditions: (1) no gesture, (2) congruent co-verbal gesture, and (3) incongruent co-verbal gesture. We hypothesized higher ratings on all dependent measures in the two multimodal (i.e., speech and gesture) conditions compared to the unimodal (i.e., speech only) condition. The results confirm our predictions: when the robot used co-verbal gestures during interaction, participants anthropomorphized it more, perceived it as more likable, reported greater shared reality with it, and expressed stronger future contact intentions than when the robot gave instructions without gestures. Surprisingly, this effect was particularly pronounced when the robot’s gestures were partly incongruent with speech, although this behavior negatively affected the participants’ task-related performance. These findings show that communicative non-verbal behaviors displayed by robotic systems affect anthropomorphic perceptions and the mental models humans form of a humanoid robot during interaction.
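To make the three-condition, between-subjects design concrete, here is a minimal, hypothetical analysis sketch in Python (not the authors’ code or data): it simulates likability ratings per condition and runs a one-way ANOVA with pairwise follow-up tests. The sample sizes, rating scale, and condition means are invented for illustration; only the ordering of means (incongruent > congruent > no gesture) mirrors the pattern reported in the abstract.

```python
# Hypothetical sketch of the condition comparison described in the abstract.
# All numbers below are simulated; they are NOT the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated 5-point likability ratings, n = 20 per condition (assumed n);
# means chosen only to reflect the reported ordering of conditions.
no_gesture  = rng.normal(3.0, 0.6, 20).clip(1, 5)
congruent   = rng.normal(3.6, 0.6, 20).clip(1, 5)
incongruent = rng.normal(4.0, 0.6, 20).clip(1, 5)

# Omnibus test across the three gesture conditions.
f_stat, p_value = stats.f_oneway(no_gesture, congruent, incongruent)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Follow-up: each multimodal condition vs. the unimodal (speech-only) one.
for label, cond in [("congruent", congruent), ("incongruent", incongruent)]:
    t, p = stats.ttest_ind(no_gesture, cond, equal_var=False)
    print(f"no gesture vs {label}: t = {t:.2f}, p = {p:.4f}")
```

The same template would apply to the other dependent measures (anthropomorphism, shared reality, future contact intentions), each rated per participant and compared across the three conditions.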
Notes
- 1.
According to Haslam et al. [9], when people are denied human nature, they are implicitly or explicitly objectified or likened to machines rather than to animals or humans.
References
- 1.
Bartneck C, Croft E, Kulic D (2008) Measuring the anthropomorphism, animacy, likeability, perceived intelligence and safety of robots. In: Proceedings of the metrics of human-robot interaction workshop, technical report 471, pp 37–41
- 2.
Bergmann K, Kopp S, Eyssel F (2010) Individualized gesturing outperforms average gesturing—evaluating gesture production in virtual humans. In: Proceedings of the 10th conference on intelligent virtual agents. Springer, Berlin, pp 104–117
- 3.
Breazeal C (2002) Designing sociable robots. AAAI Press, Menlo Park
- 4.
Buisine S, Abrilian S, Martin J-C (2004) Evaluation of multimodal behaviour of embodied agents—cooperation between speech and gestures. In: From brows to trust, vol 7, pp 217–238
- 5.
Duffy BR (2003) Anthropomorphism and the social robot. Robot Auton Syst 42(3–4):177–190
- 6.
Echterhoff G, Higgins ET, Levine JM (2009) Shared reality: experiencing commonality with others’ inner states about the world. Perspect Psychol Sci 4:496–521
- 7.
Epley N, Waytz A, Cacioppo JT (2007) On seeing human: a three-factor theory of anthropomorphism. Psychol Rev 114(4):864–886
- 8.
Goldin-Meadow S (1999) The role of gesture in communication and thinking. Trends Cogn Sci 3:419–429
- 9.
Haslam N, Bain P, Loughnan S, Kashima Y (2008) Attributing and denying humanness to others. Eur Rev Soc Psychol 19:55–85
- 10.
Iio T, Shiomi M, Shinozawa K, Akimoto T, Shimohara K, Hagita N (2011) Investigating entrainment of people’s pointing gestures by robot’s gestures using a WOZ method. Int J Soc Robot 3(4):405–414
- 11.
Kendon A (1986) Some reasons for studying gesture. Semiotica 62:1–28
- 12.
Kopp S, Wachsmuth I (2004) Synthesizing multimodal utterances for conversational agents. Comput Animat Virtual Worlds 15(1):39–52
- 13.
Krämer N, Simons N, Kopp S (2007) The effects of an embodied conversational agent’s nonverbal behavior on user’s evaluation and behavioral mimicry. In: Proceedings of the 7th conference on intelligent virtual agents, vol 4722. Springer, Berlin, pp 238–251
- 14.
Loughnan S, Haslam N (2007) Animals and androids: implicit associations between social categories and nonhumans. Psychol Sci 18:116–121
- 15.
Honda Motor Co Ltd (2000) The Honda humanoid robot ASIMO, year 2000 model
- 16.
McNeill D (1992) Hand and mind: what gestures reveal about thought. University of Chicago Press, Chicago
- 17.
Mutlu B, Shiwa T, Kanda T, Ishiguro H, Hagita N (2009) Footing in human-robot conversations: how robots might shape participant roles using gaze cues. In: Proceedings of the 4th ACM/IEEE international conference on human-robot interaction, pp 61–68
- 18.
Neff M, Wang Y, Abbott R, Walker M (2010) Evaluating the effect of gesture and language on personality perception in conversational agents. In: Proceedings of the 10th international conference on intelligent virtual agents. Springer, Berlin, pp 222–235
- 19.
Salem M (2012) Conceptual motorics—generation and evaluation of communicative robot gesture. PhD dissertation. Logos Verlag, Berlin
- 20.
Salem M, Eyssel F, Rohlfing K, Kopp S, Joublin F (2011) Effects of gesture on the perception of psychological anthropomorphism: a case study with a humanoid robot. In: Proceedings of the third international conference on social robotics. Lecture notes in artificial intelligence, vol 7072. Springer, Berlin, pp 31–41
- 21.
Salem M, Kopp S, Wachsmuth I, Rohlfing K, Joublin F (2012) Generation and evaluation of communicative robot gesture. Int J Soc Robot 4(2):201–217. Special issue on expectations, intentions, and actions
- 22.
Schröder M, Trouvain J (2003) The German text-to-speech synthesis system MARY: a tool for research, development and teaching. Int J Speech Technol 6:365–377. doi:10.1023/A:1025708916924
- 23.
Sidner CL, Kidd CD, Lee C, Lesh N (2004) Where to look: a study of human-robot engagement. In: Proceedings of the 9th international conference on intelligent user interfaces, pp 78–84
- 24.
Steinfeld A, Jenkins OC, Scassellati B (2009) The Oz of Wizard: simulating the human for interaction research. In: ACM/IEEE international conference on human-robot interaction (HRI), pp 101–108
- 25.
White RW (1959) Motivation reconsidered: the concept of competence. Psychol Rev 66:297–331
- 26.
Yamazaki A, Yamazaki K, Kuno Y, Burdelski M, Kawashima M, Kuzuoka H (2008) Precision timing in human-robot interaction: coordination of head movement and utterance. In: Proceedings of the ACM/SIGCHI conference on human factors in computing systems, pp 131–140
- 27.
Yoshikawa Y, Shinozawa K, Ishiguro H, Hagita N, Miyamoto T (2006) Responsive robot gaze to interaction partner. In: Proceedings of the robotics: science and systems conference, pp 37–43
Acknowledgements
The work described was supported by the Honda Research Institute Europe and the Center of Excellence ‘Cognitive Interaction Technology’.
About this article
Cite this article
Salem, M., Eyssel, F., Rohlfing, K. et al. To Err is Human(-like): Effects of Robot Gesture on Perceived Anthropomorphism and Likability. Int J of Soc Robotics 5, 313–323 (2013). https://doi.org/10.1007/s12369-013-0196-9
Keywords
- Social human-robot interaction
- Multimodal interaction and conversational skills
- Non-verbal cues and expressiveness
- Anthropomorphism
- Robot companions and social robots