International Journal of Social Robotics, Volume 5, Issue 3, pp 325–334

Interpretation of Emotional Body Language Displayed by a Humanoid Robot: A Case Study with Children

  • Aryel Beck
  • Lola Cañamero
  • Antoine Hiolle
  • Luisa Damiano
  • Piero Cosi
  • Fabio Tesser
  • Giacomo Sommavilla


Abstract

The work reported in this paper focuses on giving humanoid robots the capacity to express emotions with their bodies. Previous results show that adults can interpret different key poses displayed by a humanoid robot, and that changing the head position affects the expressiveness of the key poses in a consistent way: moving the head down decreases both arousal (the level of energy) and valence (how positive or negative the emotion is), whereas moving the head up increases both. Hence, changing the head position during an interaction should send intuitive signals. The study reported in this paper tested children’s ability to recognize the emotional body language displayed by a humanoid robot. The results suggest that body postures and head position can be used to convey emotions during child-robot interaction.
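The head-position effect described above can be sketched in code. The following is a hypothetical illustration, not the authors' implementation: a key pose carries a valence/arousal pair, and tilting the head shifts both dimensions in the same direction. The `gain` constant, the `KeyPose` type, and the example values are assumptions for illustration only; the study reports only the direction of the effect, not its magnitude.

```python
# Hypothetical sketch of the reported finding: head up raises both
# valence and arousal of a displayed key pose; head down lowers both.
from dataclasses import dataclass


@dataclass
class KeyPose:
    name: str
    valence: float  # -1.0 (negative) .. +1.0 (positive)
    arousal: float  # -1.0 (low energy) .. +1.0 (high energy)


def apply_head_pitch(pose: KeyPose, head_pitch_deg: float,
                     gain: float = 0.01) -> KeyPose:
    """Shift both affective dimensions in the direction of the head pitch.

    head_pitch_deg > 0 means head up, < 0 means head down; `gain` is an
    assumed scaling constant, not a value measured in the study.
    """
    shift = gain * head_pitch_deg

    def clamp(x: float) -> float:
        return max(-1.0, min(1.0, x))

    return KeyPose(pose.name,
                   clamp(pose.valence + shift),
                   clamp(pose.arousal + shift))


# A sad key pose read with the head raised should be perceived as
# slightly less negative and more energetic than the base pose.
sad = KeyPose("sadness", valence=-0.6, arousal=-0.4)
sad_head_up = apply_head_pitch(sad, head_pitch_deg=20.0)
```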


Keywords: Robotic · Emotional body language · Perception



Acknowledgements

The authors would like to thank the school “scuola media Dante Alighieri” for hosting the study, as well as Arnaud Ducamp and Cornelius Glackin for their feedback on an earlier version of this paper.



Copyright information

© Springer Science+Business Media Dordrecht 2013

Authors and Affiliations

  • Aryel Beck (1)
  • Lola Cañamero (1)
  • Antoine Hiolle (1)
  • Luisa Damiano (1, 2)
  • Piero Cosi (3)
  • Fabio Tesser (3)
  • Giacomo Sommavilla (3)
  1. Embodied Emotion, Cognition and (Inter-)Action Lab, School of Computer Science & STRI, University of Hertfordshire, Hatfield, UK
  2. Department of Human and Social Sciences, ESARG, University of Bergamo, Bergamo, Italy
  3. Institute of Cognitive Sciences and Technologies, Padova, Italy
