Using Dynamics to Recognize Human Motion

  • Gentiane Venture (email author)
  • Takumi Yabuki
  • Yuta Kinase
  • Alain Berthoz
  • Naoko Abe
Chapter
Part of the Springer Tracts in Advanced Robotics book series (STAR, volume 111)

Abstract

We explore the importance of the dynamics of motion and how it can be used, first to develop and personalize intelligent systems that understand human motion, and then to analyze motion. We propose a framework that uses not only the kinematic information of movements but also the dynamics, and that makes it possible to classify, analyze, and recognize motions and emotions in a non-verbal context. We use direct measurements of the dynamics when they are available; when they are not, we propose to compute the dynamics from the kinematics and to use it to understand human motion. Finally, we discuss some developments and concrete applications in the field of motion analysis and give experimental results using gait and a simple choreography.
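To make the second case concrete: computing the dynamics from the kinematics is an inverse-dynamics calculation, in which joint angles recorded by motion capture are differentiated numerically and fed into the equations of motion of a body model to recover joint torques. The sketch below illustrates this for a single rigid link swinging about a fixed joint; the segment parameters, the pendulum model, and the function name are illustrative assumptions, not the full-body model used in the chapter.

```python
import numpy as np

# Illustrative segment parameters (assumed values, not from the chapter):
# a single rigid link rotating about a fixed joint, e.g. a shank about the knee.
MASS = 3.5       # segment mass [kg]
COM = 0.17       # distance from joint axis to centre of mass [m]
INERTIA = MASS * COM**2 + 0.06   # moment of inertia about the joint [kg m^2]
G = 9.81         # gravitational acceleration [m/s^2]

def joint_torque_from_kinematics(q, dt):
    """Recover a joint-torque trajectory from a joint-angle trajectory.

    q  : array of joint angles [rad], sampled at a fixed rate
    dt : sampling period [s]
    """
    # Numerical differentiation of the measured kinematics.
    dq = np.gradient(q, dt)       # joint velocity [rad/s]
    ddq = np.gradient(dq, dt)     # joint acceleration [rad/s^2]

    # Inverse dynamics of a single-link pendulum:
    # tau = I * ddq + m * g * c * sin(q)
    return INERTIA * ddq + MASS * G * COM * np.sin(q)

if __name__ == "__main__":
    dt = 0.01                                   # 100 Hz motion capture
    t = np.arange(0.0, 2.0, dt)
    q = 0.5 * np.sin(2.0 * np.pi * t)           # synthetic swing motion
    tau = joint_torque_from_kinematics(q, dt)
    print(f"peak joint torque: {np.max(np.abs(tau)):.2f} N m")
```

Torque trajectories obtained this way, or dynamic quantities measured directly such as contact forces from a force plate, can then be summarized into feature vectors for the classification and recognition steps of the framework.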

Keywords

Feature Vector · Contact Force · Emotion Recognition · Human Motion · Joint Torque

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Gentiane Venture (1), email author
  • Takumi Yabuki (1)
  • Yuta Kinase (1)
  • Alain Berthoz (2)
  • Naoko Abe (3)
  1. Tokyo University of Agriculture and Technology, Koganei, Japan
  2. Laboratoire de Physiologie et de Perception de l'Action, Collège de France, Paris, France
  3. Gepetto Team, LAAS-CNRS, Toulouse, France
