Expressive motions recognition and analysis with learning and statistical methods

  • Insaf Ajili
  • Zahra Ramezanpanah
  • Malik Mallem
  • Jean-Yves Didier

Abstract

This paper proposes to recognize and analyze expressive gestures using a descriptive motion language, the Laban Movement Analysis (LMA) method. We extract body features based on LMA factors, which describe both quantitative and qualitative aspects of human movement. For our study, a dataset of 5 gestures performed with 4 emotions was created using the Xsens motion capture system. We use two different approaches for emotion analysis and recognition. The first is based on a machine learning method, the Random Decision Forest (RDF). The second is based on human perception. We then derive the most important features for each expressed emotion using both methods: the RDF and human ratings. Finally, we compare the results obtained from the automatic learning method against human perception in the discussion section.
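To make the recognition side of the pipeline concrete, below is a minimal sketch of how pre-extracted LMA descriptors could be classified with a Random Decision Forest and how feature importances can be read off. It assumes scikit-learn and synthetic placeholder data; the feature names, dataset size, and hyperparameters are illustrative assumptions, not the authors' implementation.

# Minimal sketch (not the authors' implementation): classifying expressed
# emotions from pre-extracted LMA feature vectors with a random forest and
# ranking feature importances. Feature names, data, and hyperparameters are
# illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical LMA-based descriptors (Body, Effort, Shape, Space factors),
# one row per recorded gesture; four emotion labels as in the dataset.
feature_names = ["hands_distance", "convex_hull_volume", "velocity",
                 "acceleration", "jerk", "head_torso_angle"]
X = rng.normal(size=(200, len(feature_names)))  # placeholder feature matrix
y = rng.integers(0, 4, size=200)                # 0..3 = four emotion labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("Cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# Rank features by importance, analogous to deriving the most relevant
# LMA features (here computed over all emotion classes at once).
clf.fit(X, y)
for name, importance in sorted(zip(feature_names, clf.feature_importances_),
                               key=lambda t: t[1], reverse=True):
    print(f"{name}: {importance:.3f}")

In the paper, the most important features are derived per expressed emotion; the same importance attribute could be inspected on one-vs-rest classifiers to obtain such per-emotion rankings.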

Keywords

Expressive motion recognition · Laban movement analysis · Random decision forest · Human perception · Feature importance

Notes

Acknowledgements

We would like to thank the staff of the University of Evry Val d’Essonne for taking part in the creation of our dataset. We would also like to thank Mrs. Alice Jourlin for her help with data gathering and tabulation. This work was partially supported by the Strategic Research Initiatives project iCODE, accredited by University Paris Saclay.

Funding Information

This study was funded by the Strategic Research Initiatives project iCODE, University Paris Saclay.

Compliance with Ethical Standards

Conflict of interest

Author Insaf Ajili declares that she has no conflict of interest. Author Zahra Ramezanpanah declares that she has no conflict of interest. Author Malik Mallem declares that he has no conflict of interest. Author Jean-Yves Didier declares that he has no conflict of interest.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. IBISC, Univ Evry, Université Paris-Saclay, Evry, France
