Journal of Nonverbal Behavior, Volume 32, Issue 2, pp 79–92

Recognition of Emotions in Gait Patterns by Means of Artificial Neural Nets

  • Daniel Janssen
  • Wolfgang I. Schöllhorn
  • Jessica Lubienetzki
  • Karina Fölling
  • Henrike Kokenge
  • Keith Davids
Original Paper


This paper describes an application of artificial neural nets to the recognition of emotions from kinetic and kinematic gait data. Two experiments were undertaken: the first attempted to identify participants' emotional states from their gait patterns, and the second analyzed the effects of listening to music on gait patterns. In the first experiment, gait was analyzed while participants attempted to simulate four distinct emotional states (normal, happy, sad, angry). In the second experiment, participants listened to different types of music (excitatory, calming, no music) before and during gait analysis. The derived data were fed into different types of artificial neural nets. Results not only showed a clear distinction between individuals but also revealed clear indications of emotion recognition by the nets.
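The pipeline the abstract summarizes, feeding derived gait features into a supervised neural net that classifies the four simulated emotional states, can be sketched as follows. This is an illustrative sketch only, not the authors' actual method or data: the feature dimensions, cluster structure, and network size here are synthetic placeholders.

```python
# Minimal sketch: classify simulated emotional states from gait-like
# feature vectors with a small feedforward net. All data are synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
emotions = ["normal", "happy", "sad", "angry"]

# Hypothetical "gait features" (e.g., stride length, cadence, joint-angle
# ranges): one Gaussian cluster of 30 samples per simulated emotion.
X = np.vstack([rng.normal(loc=i, scale=0.3, size=(30, 6)) for i in range(4)])
y = np.repeat(emotions, 30)

# One hidden layer, trained by backpropagation, as in classic
# multilayer-perceptron approaches to movement-pattern recognition.
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(X, y)
print(f"training accuracy: {net.score(X, y):.2f}")
```

On such cleanly separated synthetic clusters the net classifies essentially perfectly; real gait data would of course require careful feature extraction and held-out evaluation.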


Keywords: Emotion · Gait · Music · Neural network · Pattern recognition



Acknowledgments

We wish to thank Larry Katz and Veronica Everton-Williams for their useful comments.



Copyright information

© Springer Science+Business Media, LLC 2008

Authors and Affiliations

  • Daniel Janssen (1), corresponding author
  • Wolfgang I. Schöllhorn (1)
  • Jessica Lubienetzki (2)
  • Karina Fölling (2)
  • Henrike Kokenge (2)
  • Keith Davids (3)

  1. Training and Movement Science, University of Mainz, Mainz, Germany
  2. Training Science, University of Muenster, Münster, Germany
  3. School of Human Movement Studies, Queensland University of Technology, Brisbane, Australia
