Expressive motion recognition and analysis with learning and statistical methods
This paper proposes to recognize and analyze expressive gestures using a descriptive motion language, the Laban Movement Analysis (LMA) method. We extract body features based on LMA factors, which describe both quantitative and qualitative aspects of human movement. For this study, a dataset of 5 gestures performed with 4 emotions was created using the Xsens motion capture system. We used two different approaches for emotion analysis and recognition: the first is based on a machine learning method, the Random Decision Forest (RDF); the second is based on human perception. We then derive the most important features for each expressed emotion using both methods, the RDF and human ratings, and compare the results of the automatic learning method against human perception in the discussion section.
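To illustrate the learning side of the approach described above, the sketch below trains a random forest on motion features and ranks them by importance, analogous to deriving the most important features per emotion. This is not the authors' code: the feature names, dataset, and library choice (scikit-learn) are assumptions for illustration only.

```python
# Illustrative sketch (not the authors' implementation): classify emotion
# labels from LMA-style motion features with a random forest, then rank
# feature importance. Feature names and data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical LMA-inspired features, one value per gesture sample.
feature_names = ["body_expansion", "hand_speed", "acceleration",
                 "jerk", "head_orientation", "space_directness"]
emotions = ["happy", "sad", "angry", "neutral"]

# Synthetic dataset: 200 gesture samples, 4 emotion classes.
X = rng.normal(size=(200, len(feature_names)))
y = rng.integers(0, len(emotions), size=200)
# Inject a dependency so one feature carries most of the signal.
X[:, 1] += 0.8 * y

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# Rank features by mean decrease in impurity, one common way to
# estimate which features matter most for each classification.
ranking = sorted(zip(feature_names, clf.feature_importances_),
                 key=lambda p: -p[1])
for name, score in ranking:
    print(f"{name}: {score:.3f}")
```

In this toy setup the injected `hand_speed` dependency dominates the ranking; on real motion-capture data the importance scores would instead reflect which LMA factors best separate the expressed emotions.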
Keywords: Expressive motion recognition · Laban movement analysis · Random decision forest · Human perception · Feature importance
We would like to thank the staff of the University of Evry Val d'Essonne for participating in our dataset. We also thank Mrs. Alice Jourlin for her help with data gathering and tabulation. This work was partially supported by the Strategic Research Initiatives project iCODE, accredited by University Paris Saclay.
This study was funded by the Strategic Research Initiatives project iCODE, University Paris Saclay.
Compliance with Ethical Standards
Conflict of interest
Author Insaf Ajili declares that she has no conflict of interest. Author Zahra Ramezanpanah declares that she has no conflict of interest. Author Malik Mallem declares that he has no conflict of interest. Author Jean Yves Didier declares that he has no conflict of interest.