
Emotions and Messages in Simple Robot Gestures

  • Jamy Li
  • Mark Chignell
  • Sachi Mizobuchi
  • Michiaki Yasumura
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5611)

Abstract

Understanding how people interpret robot gestures will aid the design of effective social robots. In two studies, we examine how gestures are generated and interpreted for a simple social robot capable of head and arm movement. In the first study, four participants created gestures, each with a corresponding message and emotion, based on 12 different scenarios provided to them. In the second study, the resulting gestures were shown to 12 participants, who judged which emotions and messages were being conveyed. Knowledge (present or absent) of the motivating scenario (context) for each gesture was manipulated as an experimental factor. Context was found to assist message understanding while providing only modest assistance to emotion recognition. Although better than chance, accuracy was relatively low for both emotion recognition (22%) and message understanding (40%). The results are discussed in terms of implied guidelines for designing gestures for social robots.
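The abstract's claim that recognition accuracy was "better than chance" can be checked with an exact binomial test. The sketch below is illustrative only: the paper does not report the number of judgement trials or the number of response categories, so the trial count (144 = 12 judges × 12 gestures) and the chance rate (1/6, i.e. six emotion categories) are assumptions, not figures from the study.

```python
from math import comb

def binom_p_at_least(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the probability of observing
    k or more correct judgements if each trial succeeds by chance
    with probability p (one-sided exact binomial test)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical numbers: 12 judges x 12 gestures = 144 emotion judgements;
# 22% accuracy is roughly 32 correct; chance = 1/6 if six emotion
# categories were offered. (All of these counts are assumptions.)
p_value = binom_p_at_least(32, 144, 1 / 6)
print(p_value)
```

With more response categories the chance rate drops and the same 22% accuracy becomes more clearly above chance, which is why the category count matters for interpreting the reported figures.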

Keywords

Human-Robot Interaction · Gestures · Social Robots · Emotion



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Jamy Li¹
  • Mark Chignell¹
  • Sachi Mizobuchi²
  • Michiaki Yasumura³

  1. Interactive Media Lab, Department of Mechanical and Industrial Engineering, University of Toronto, Toronto, Canada
  2. Toyota InfoTechnology Center, Tokyo, Japan
  3. Interactive Design Lab, Faculty of Environment and Information Studies, Keio University, Kanagawa, Japan
