International Journal of Social Robotics, Volume 5, Issue 3, pp 313–323

To Err is Human(-like): Effects of Robot Gesture on Perceived Anthropomorphism and Likability

  • Maha Salem
  • Friederike Eyssel
  • Katharina Rohlfing
  • Stefan Kopp
  • Frank Joublin

Abstract

Previous work has shown that non-verbal behaviors affect anthropomorphic inferences about artificial communicators such as virtual agents or social robots. In an experiment with a humanoid robot, we investigated the effects of the robot’s hand and arm gestures on perceived human-likeness, likability of the robot, shared reality, and future contact intentions after interacting with the robot. For this purpose, the speech-accompanying non-verbal behaviors of the humanoid robot were manipulated in three experimental conditions: (1) no gesture, (2) congruent co-verbal gesture, and (3) incongruent co-verbal gesture. We hypothesized higher ratings on all dependent measures in the two multimodal (i.e., speech and gesture) conditions compared to the unimodal (i.e., speech-only) condition. The results confirm our predictions: when the robot used co-verbal gestures during interaction, participants anthropomorphized it more, perceived it as more likable, reported greater shared reality with it, and expressed stronger future contact intentions than when the robot gave instructions without gestures. Surprisingly, this effect was particularly pronounced when the robot’s gestures were partly incongruent with its speech, even though this behavior negatively affected participants’ task-related performance. These findings show that communicative non-verbal behaviors displayed by robotic systems affect anthropomorphic perceptions and the mental models humans form of a humanoid robot during interaction.
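The study design is a one-factor between-subjects comparison (no gesture vs. congruent gesture vs. incongruent gesture) with rating-scale dependent measures. As a minimal sketch of how such condition means could be compared, the snippet below runs a one-way ANOVA with SciPy on hypothetical likability ratings; the scores, group sizes, and choice of test are illustrative assumptions, not the study’s actual data or analysis pipeline.

    # Minimal sketch: one-way ANOVA across the three gesture conditions.
    # All ratings below are hypothetical placeholders, not study data.
    from scipy import stats

    no_gesture  = [2.8, 3.1, 2.5, 3.0, 2.9]   # speech only
    congruent   = [3.6, 3.9, 3.4, 3.8, 3.7]   # speech + matching gesture
    incongruent = [4.1, 3.8, 4.3, 4.0, 4.2]   # speech + partly mismatched gesture

    f_stat, p_value = stats.f_oneway(no_gesture, congruent, incongruent)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    # A significant F would motivate pairwise follow-ups (e.g., Tukey HSD)
    # to locate which conditions differ.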

Keywords

Social human-robot interaction · Multimodal interaction and conversational skills · Non-verbal cues and expressiveness · Anthropomorphism · Robot companions and social robots


Copyright information

© Springer Science+Business Media Dordrecht 2013

Authors and Affiliations

  • Maha Salem 1
  • Friederike Eyssel 2
  • Katharina Rohlfing 3
  • Stefan Kopp 4
  • Frank Joublin 5

  1. Research Institute for Cognition and Robotics, Bielefeld University, Bielefeld, Germany
  2. Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
  3. Emergentist Semantics Group, Bielefeld University, Bielefeld, Germany
  4. Sociable Agents Group, Bielefeld University, Bielefeld, Germany
  5. Honda Research Institute Europe, Offenbach, Germany
