International Journal of Social Robotics, Volume 6, Issue 2, pp 261–280

Recognizing Emotional Body Language Displayed by a Human-like Social Robot



Abstract

Natural social human–robot interactions (HRIs) require that robots be able to perceive and identify complex human social behaviors and, in turn, display their own behaviors using similar communication modes. Body language has been found to play an important role in conveying information about changes in human emotions during human–human interactions. Our work focuses on extending this concept to robotic affective communication during social HRI. Namely, in this paper, we explore the design of emotional body language for our human-like social robot, Brian 2.0. We develop emotional body language for the robot using a variety of body postures and movements identified in human emotion research. To date, only a handful of researchers have focused on the use of robotic body language to display emotions, with a significant emphasis on the display of emotions through dance. Such emotional dance can be effective for small robots with large workspaces; however, it is less appropriate for life-sized robots such as Brian 2.0 engaging in one-on-one interpersonal social interactions with a person. Experiments are presented to evaluate the feasibility of the robot's emotional body language based on human recognition rates. Furthermore, a unique comparison study is presented to investigate the perception of human body language features displayed by the robot with respect to the same body language features displayed by a human actor.


Keywords: Emotional body language · Social robots · Human–robot interactions · Human emotion research



Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

Autonomous Systems and Biomechatronics Laboratory, Department of Mechanical and Industrial Engineering, University of Toronto, Toronto, Canada
