Children Interpretation of Emotional Body Language Displayed by a Robot

  • Aryel Beck
  • Lola Cañamero
  • Luisa Damiano
  • Giacomo Sommavilla
  • Fabio Tesser
  • Piero Cosi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7072)

Abstract

Previous results show that adults are able to interpret the different key poses displayed by the robot, and that changing the head position affects the expressiveness of the key poses in a consistent way: moving the head down decreases arousal (the level of energy), valence (positive or negative) and stance (approaching or avoiding), whereas moving the head up increases all three dimensions [1]. Hence, changing the head position during an interaction should send intuitive emotional signals. The ALIZ-E target group is children aged 8 to 11, and existing results suggest that children of this age are able to interpret human emotional body language [2, 3].

Based on these results, an experiment was conducted to test whether the findings of [1] also apply to children. If so, body postures and head position could be used to convey emotions during an interaction.
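As a rough illustration of the relation reported in [1], the sketch below (Python, not from the paper) shifts all three affect dimensions of a key pose in the direction of a head movement. The `Affect` type, the `apply_head_pitch` function, the sign convention (positive pitch means head up) and the `gain` factor are all illustrative assumptions, not part of the study.

```python
# A minimal sketch of the qualitative relation reported in [1]: raising the
# robot's head increases the arousal, valence and stance of a key pose, and
# lowering it decreases them. All names, ranges and the gain factor are
# illustrative assumptions, not values from the study.
from dataclasses import dataclass


@dataclass
class Affect:
    arousal: float  # level of energy, assumed range [-1, 1]
    valence: float  # positive vs. negative, assumed range [-1, 1]
    stance: float   # approaching vs. avoiding, assumed range [-1, 1]


def clamp(x: float, lo: float = -1.0, hi: float = 1.0) -> float:
    return max(lo, min(hi, x))


def apply_head_pitch(base: Affect, head_pitch: float, gain: float = 0.5) -> Affect:
    """Shift all three dimensions in the direction of the head movement.

    head_pitch > 0 means the head moves up, < 0 down (an assumed convention).
    """
    delta = gain * head_pitch
    return Affect(
        arousal=clamp(base.arousal + delta),
        valence=clamp(base.valence + delta),
        stance=clamp(base.stance + delta),
    )


# A neutral key pose becomes more positive and energetic with the head up,
# and less so with the head down.
neutral = Affect(arousal=0.0, valence=0.0, stance=0.0)
print(apply_head_pitch(neutral, head_pitch=1.0))   # all three increase
print(apply_head_pitch(neutral, head_pitch=-1.0))  # all three decrease
```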

References

  1. Beck, A., Cañamero, L., Bard, K.: Towards an affect space for robots to display emotional body language. In: Ro-Man 2010. IEEE, Viareggio (2010)
  2. Boone, R.T., Cunningham, J.G.: The Attribution of Emotion to Expressive Body Movements: A Structural Cue Analysis. Manuscript (1996)
  3. Boone, R.T., Cunningham, J.G.: Children’s decoding of emotion in expressive body movement: the development of cue attunement. Dev. Psychol. 34(5), 1007–1016 (1998)
  4. Gillies, M., et al.: Responsive listening behavior. Computer Animation and Virtual Worlds 19(5), 579–589 (2008)
  5. Breazeal, C.: Designing Sociable Robots. Intelligent Robotics & Autonomous Agents. MIT Press, Cambridge (2002)
  6. Beck, A., Stevens, B., Bard, K.: Comparing perception of affective body movements displayed by actors and animated characters. In: AISB 2009, Edinburgh, UK (2009)
  7. de Gelder, B.: Towards the neurobiology of emotional body language. Nature Reviews Neuroscience 7(3), 242–249 (2006)
  8. Kleinsmith, A., De Silva, P.R., Bianchi-Berthouze, N.: Cross-cultural differences in recognizing affect from body posture. Interacting with Computers 18(6), 1371–1389 (2006)
  9. Thomas, F., Johnston, O.: The Illusion of Life. Abbeville Press, New York (1995)
  10. Ekman, P., Friesen, W.V., Hager, J.C.: Facial Action Coding System: The Manual. Human Face, Salt Lake City (2002)
  11. Cassell, J.: Nudge nudge wink wink: elements of face-to-face conversation. In: Cassell, J., et al. (eds.) Embodied Conversational Agents, pp. 1–27. MIT Press, Cambridge (2000)
  12. Vinayagamoorthy, V., et al.: Building expression into virtual characters. In: Eurographics 2006. Proc. Eurographics, Vienna (2006)
  13. De Silva, P.R., Bianchi-Berthouze, N.: Modeling human affective postures: an information theoretic characterization of posture features. Computer Animation and Virtual Worlds 15(3-4), 269–276 (2004)
  14. Atkinson, A.P., et al.: Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception 33, 717–746 (2004)
  15. Wallbott, H.G.: Bodily expression of emotion. European Journal of Social Psychology 28(6), 879–896 (1998)
  16. Pollick, F.E., et al.: Perceiving affect from arm movement. Cognition 82(2), 51–61 (2001)
  17. Kleinsmith, A., Bianchi-Berthouze, N., Steed, A.: Automatic recognition of non-acted affective postures. IEEE Trans. Syst. Man Cybern. B Cybern. (2011)
  18. Schouwstra, S., Hoogstraten, J.: Head position and spinal position as determinants of perceived emotional state. Perceptual and Motor Skills 81(2), 673–674 (1995)
  19. Maestri, G.: Digital Character Animation, 3rd edn. Kissane, E., Kalning, K. (eds.). New Riders, Berkeley (2006)
  20. Tonks, J., et al.: Assessing emotion recognition in 9–15-year-olds: preliminary analysis of abilities in reading emotion from faces, voices and eyes. Brain Inj. 21(6), 623–629 (2007)
  21. Woods, S., Dautenhahn, K., Schultz, J.: Child and adults’ perspectives on robot appearance. In: AISB 2005 Symposium on Robot Companion, pp. 126–132. SSAISB, Hatfield (2005)
  22. Beck, A., et al.: Interpretation of emotional body language displayed by robots. In: AFFINE 2010. ACM, Firenze (2010)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Aryel Beck (1)
  • Lola Cañamero (1)
  • Luisa Damiano (1)
  • Giacomo Sommavilla (2)
  • Fabio Tesser (2)
  • Piero Cosi (2)
  1. Adaptive Systems Research Group, School of Computer Science & STRI, University of Hertfordshire, United Kingdom
  2. Institute of Cognitive Sciences and Technologies, Padova, Italy