International Journal of Social Robotics, Volume 3, Issue 2, pp 125–142

Communication of Emotion in Social Robots through Simple Head and Arm Movements

  • Jamy Li
  • Mark Chignell


Understanding how people perceive robot gestures will aid the design of robots capable of social interaction with humans. We examined the generation and perception of a restricted form of gesture in a robot capable of simple head and arm movement, referring to point-light animation and video experiments in human motion to derive our hypotheses. Four studies were conducted to look at the effects of situational context, gesture complexity, emotional valence and author expertise. In Study 1, four participants created gestures with corresponding emotions based on 12 scenarios provided. The resulting gestures were judged by 12 participants in a second study. Participants’ recognition of emotion was better than chance and improved when situational context was provided. Ratings of lifelikeness were found to be related to the number of arm movements (but not head movements) in a gesture. In Study 3, five novices and five puppeteers created gestures conveying Ekman’s six basic emotions which were shown to 12 Study 4 participants. Puppetry experience improved identification rates only for the emotions of fear and disgust, possibly because of limitations with the robot’s movement. The results demonstrate the communication of emotion by a social robot capable of only simple head and arm movement.


Keywords: Human-robot interaction, Gesture design, Communication of emotions, Puppetry





Copyright information

© Springer Science & Business Media BV 2010

Authors and Affiliations

  1. Department of Mechanical and Industrial Engineering, University of Toronto, Toronto, Canada
