Real-Time Emotion Recognition from Natural Bodily Expressions in Child-Robot Interaction

  • Weiyi Wang (Email author)
  • Georgios Athanasopoulos
  • Georgios Patsis
  • Valentin Enescu
  • Hichem Sahli
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8927)

Abstract

Emotion perception and interpretation is one of the key desired capabilities of assistive robots, as it could greatly enhance the quality and naturalness of human-robot interaction. According to psychological studies, bodily communication plays an important role in human social behaviour. However, modelling such affective bodily expressions is very challenging, especially in a naturalistic setting, given the variety of expressive patterns and the difficulty of acquiring reliable data. In this paper, we investigate the problem of spontaneous dimensional emotion prediction in a child-robot interaction scenario. The paper presents the emotion elicitation procedure, data acquisition, 3D skeletal representation, feature design, and machine learning algorithms. Experimental results show good predictive performance on the variation trends of the emotional dimensions, especially the arousal dimension.
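To make the pipeline summarised above concrete, the following is a minimal illustrative sketch, not the authors' actual method: it derives simple kinematic features (mean joint speed and acceleration) from a 3D skeleton sequence and fits a closed-form ridge regressor to a continuous arousal trace. All names, the toy data, and the choice of ridge regression are assumptions for illustration only; the paper's own feature design and learning algorithms differ.

```python
import numpy as np

def kinematic_features(skel, window=5):
    """skel: (T, J, 3) array of J joint positions over T frames.
    Returns per-frame features: mean joint speed and mean acceleration
    magnitude, smoothed with a moving average (illustrative choice)."""
    vel = np.diff(skel, axis=0)                       # (T-1, J, 3) frame-to-frame motion
    acc = np.diff(vel, axis=0)                        # (T-2, J, 3)
    speed = np.linalg.norm(vel, axis=2).mean(axis=1)  # (T-1,) mean joint speed
    accel = np.linalg.norm(acc, axis=2).mean(axis=1)  # (T-2,)
    n = len(accel)
    feats = np.stack([speed[:n], accel], axis=1)      # align lengths
    kernel = np.ones(window) / window                 # moving-average smoothing
    return np.stack(
        [np.convolve(f, kernel, mode="same") for f in feats.T], axis=1)

def ridge_fit(X, y, lam=1e-2):
    """Closed-form ridge regression with bias: w = (X'X + lam*I)^-1 X'y."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    A = Xb.T @ Xb + lam * np.eye(Xb.shape[1])
    return np.linalg.solve(A, Xb.T @ y)

def ridge_predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return Xb @ w

# Toy data: synthesise a skeleton whose motion magnitude tracks a
# sinusoidal "arousal" trace (faster movement ~ higher arousal).
rng = np.random.default_rng(0)
T, J = 200, 15
arousal = np.sin(np.linspace(0, 4 * np.pi, T - 2)) * 0.5 + 0.5
scale = (0.01 + 0.05 * np.r_[arousal, arousal[-1], arousal[-1]])[:, None, None]
skel = np.cumsum(rng.normal(scale=scale, size=(T, J, 3)), axis=0)

X = kinematic_features(skel)
w = ridge_fit(X, arousal)
pred = ridge_predict(w, X)
corr = np.corrcoef(pred, arousal)[0, 1]  # should track the arousal trend
```

The design mirrors the paper's theme that arousal is the dimension most readily recovered from body movement: gross kinetic energy of the skeleton is a strong correlate of arousal, whereas valence typically needs richer posture and context cues.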

Keywords

Spontaneous emotion recognition · Child-robot interaction · Bodily expressions


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Weiyi Wang (1) (Email author)
  • Georgios Athanasopoulos (1)
  • Georgios Patsis (1)
  • Valentin Enescu (1)
  • Hichem Sahli (1, 2)

  1. Department of Electronics and Informatics (ETRO) - AVSP, Vrije Universiteit Brussel (VUB), Brussels, Belgium
  2. Interuniversity Microelectronics Centre (IMEC), Heverlee, Belgium
