Analyzing Gait Using a Time-of-Flight Camera

  • Rasmus R. Jensen
  • Rasmus R. Paulsen
  • Rasmus Larsen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5575)

Abstract

An algorithm is presented that performs human gait analysis using spatial data and amplitude images from a Time-of-flight camera. For each frame in a sequence, the camera supplies Cartesian coordinates in space for every pixel. Using an articulated model, the subject's pose is estimated in the depth map of each frame. The pose estimation combines a likelihood term, contrast in the amplitude image, a smoothness term, and a shape prior in a Markov random field, which is then solved. Based on the pose estimates and the prior that movement is locally smooth, a sequential model is created, and a gait analysis is performed on this model. The output parameters are speed, cadence (steps per minute), step length, stride length (a stride being two consecutive steps, also known as a gait cycle), and range of motion (joint angles). The system produces good estimates of these parameters and requires no user interaction.
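As an illustration of the pose-estimation step, one plausible form of the Markov random field energy described above is sketched here. The specific terms and weights (the shape-prior function, the smoothness and contrast weights, and the noise scale) are assumptions made for illustration and are not taken from the paper:

```latex
% Hypothetical MRF energy over per-pixel labels l_i (subject / background),
% combining a depth likelihood, a shape prior from the articulated model,
% and a contrast-sensitive smoothness term on the amplitude image A.
E(\mathbf{l}) = \sum_{i} \Big( -\log p(d_i \mid l_i) + \lambda_{\text{shape}}\, \phi(l_i) \Big)
  + \sum_{(i,j) \in \mathcal{N}} \Big( \lambda_{\text{smooth}}
  + \lambda_{\text{contrast}}\, e^{-(A_i - A_j)^2 / 2\sigma^2} \Big)\, \delta(l_i \neq l_j)
```

The gait parameters listed above follow from their standard definitions once footfall (heel-strike) events have been extracted from the sequential model. The following sketch assumes a hypothetical list of heel-strike times and positions; the function name and data layout are illustrative and do not reflect the authors' implementation:

```python
import numpy as np

def gait_parameters(strike_times, strike_positions):
    """Compute basic gait parameters from heel-strike events.

    strike_times     -- 1D array of heel-strike times in seconds (alternating feet)
    strike_positions -- (N, 3) array of heel positions in metres at those times
    """
    strike_times = np.asarray(strike_times, dtype=float)
    strike_positions = np.asarray(strike_positions, dtype=float)

    # Step length: distance between consecutive (left/right) heel strikes.
    step_lengths = np.linalg.norm(np.diff(strike_positions, axis=0), axis=1)

    # Stride length: distance covered over two consecutive steps (one gait cycle).
    stride_lengths = np.linalg.norm(strike_positions[2:] - strike_positions[:-2], axis=1)

    duration = strike_times[-1] - strike_times[0]
    n_steps = len(strike_times) - 1

    return {
        "speed_m_per_s": step_lengths.sum() / duration,      # distance walked / elapsed time
        "cadence_steps_per_min": 60.0 * n_steps / duration,  # steps per minute
        "step_length_m": step_lengths.mean(),
        "stride_length_m": stride_lengths.mean(),
    }
```

Range of motion would be obtained separately, for example as the extrema of the articulated model's joint angles over a gait cycle.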

Keywords

Time-of-flight camera · Markov random fields · gait analysis · computer vision

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Rasmus R. Jensen¹
  • Rasmus R. Paulsen¹
  • Rasmus Larsen¹
  1. Informatics and Mathematical Modelling, Technical University of Denmark, Kgs. Lyngby, Denmark
