Analyzing and Evaluating Markerless Motion Tracking Using Inertial Sensors

  • Andreas Baak
  • Thomas Helten
  • Meinard Müller
  • Gerard Pons-Moll
  • Bodo Rosenhahn
  • Hans-Peter Seidel
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6553)

Abstract

In this paper, we introduce a novel framework for automatically evaluating the quality of 3D tracking results obtained from markerless motion capture. In our approach, we use additional inertial sensors to generate suitable reference information. In contrast to previously used marker-based evaluation schemes, inertial sensors are inexpensive, easy to operate, and impose comparatively weak additional constraints on the overall recording setup with regard to location, recording volume, and illumination. On the downside, the acceleration and rate-of-turn data obtained from such inertial systems turn out to be unsuitable representations for tracking evaluation. As our main contribution, we show how tracking results can be analyzed and evaluated on the basis of suitable limb orientations, which can be derived from 3D tracking results as well as from enhanced inertial sensors fixed on these limbs. Our experiments on various motion sequences of different complexity demonstrate that such limb orientations constitute a suitable mid-level representation for robustly detecting most of the tracking errors. In particular, our evaluation approach also reveals misconfigurations and twists of the limbs that can hardly be detected with traditional evaluation metrics.
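To make the idea of comparing limb orientations concrete, the following is a minimal illustrative sketch (not the authors' implementation) of an orientation-based error metric: the geodesic distance on SO(3) between a limb orientation estimated by the tracker and the orientation reported by an inertial sensor on that limb. The names `R_track` and `R_imu` are assumptions for illustration, standing for 3×3 rotation matrices expressed in a common global coordinate system.

```python
import numpy as np

def orientation_error_deg(R_track, R_imu):
    """Geodesic distance on SO(3) between two limb orientations, in degrees.

    R_track, R_imu: 3x3 rotation matrices in the same global frame.
    The angle of the relative rotation R_track^T * R_imu serves as the
    orientation-based tracking error for this limb.
    """
    R_rel = R_track.T @ R_imu
    # trace(R_rel) = 1 + 2*cos(angle); clip guards against numerical noise
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle))

# Example: a limb twisted by 90 degrees about one axis yields a large
# orientation error even when joint positions would look plausible.
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
print(orientation_error_deg(np.eye(3), Rz90))  # 90.0
```

Such an angle is insensitive to position errors but directly exposes limb twists, which is precisely the class of errors the abstract argues conventional position-based metrics miss.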

Keywords

Tracking Error · Inertial Sensor · Orientation Data · Tracking Result · Global Coordinate System
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

References

  1. Agarwal, A., Triggs, B.: Learning to track 3D human motion from silhouettes. In: ICML 2004: Proceedings of the 21st International Conference on Machine Learning, pp. 2–9. ACM, New York (2004)
  2. Bregler, C., Malik, J., Pullen, K.: Twist based acquisition and tracking of animal and human kinetics. International Journal of Computer Vision 56(3), 179–194 (2004)
  3. Brodie, M., Walmsley, A., Page, W.: Fusion motion capture: a prototype system using inertial measurement units and GPS for the biomechanical analysis of ski racing. Sports Technology 1(1), 17–28 (2008)
  4. Brox, T., Rosenhahn, B., Kersting, U.G., Cremers, D.: Nonparametric density estimation for human pose tracking. In: Franke, K., Müller, K.-R., Nickolay, B., Schäfer, R. (eds.) DAGM 2006. LNCS, vol. 4174, pp. 546–555. Springer, Heidelberg (2006)
  5. CMU: CMU multi-modal activity database (2010), http://kitchen.cs.cmu.edu
  6. Daniilidis, K.: Hand-eye calibration using dual quaternions. The International Journal of Robotics Research 18(3), 286–298 (1999)
  7. Dejnabadi, H., Jolles, B.M., Casanova, E., Fua, P., Aminian, K.: Estimation and visualization of sagittal kinematics of lower limbs orientation using body-fixed sensors. IEEE Transactions on Biomedical Engineering 53(7), 1385–1393 (2006)
  8. Foxlin, E.: Pedestrian tracking with shoe-mounted inertial sensors. IEEE Computer Graphics and Applications 25(6), 38–46 (2005)
  9. Grassia, F.S.: Practical parameterization of rotations using the exponential map. Journal of Graphics, GPU, and Game Tools 3(3), 29–48 (1998)
  10. Harada, T., Mori, T., Sato, T.: Development of a tiny orientation estimation device to operate under motion and magnetic disturbance. The International Journal of Robotics Research 26(6), 547–559 (2007)
  11. Hol, J.D., Schön, T.B., Gustafsson, F.: Relative pose calibration of a spherical camera and an IMU. In: 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, pp. 21–24 (September 2008)
  12. Huynh, D.Q.: Metrics for 3D rotations: Comparison and analysis. Journal of Mathematical Imaging and Vision 35(2), 155–164 (2009)
  13. Kunze, K., Lukowicz, P.: Dealing with sensor displacement in motion-based onbody activity recognition systems. In: UbiComp 2008: Proceedings of the 10th International Conference on Ubiquitous Computing, pp. 20–29. ACM, New York (2008)
  14. Lowe, D.: Solving for the parameters of object models from image descriptions. In: Image Understanding Workshop, College Park, pp. 121–127 (April 1980)
  15. Müller, M.: Information Retrieval for Music and Motion. Springer (2007)
  16. Multimodal human motion database MPI08, http://www.tnt.uni-hannover.de/project/MPI08_Database/
  17. Park, F.C., Martin, B.J.: Robot sensor calibration: solving AX = XB on the Euclidean group. IEEE Transactions on Robotics and Automation 10(5), 717–721 (1994)
  18. Pons-Moll, G., Baak, A., Helten, T., Müller, M., Seidel, H.-P., Rosenhahn, B.: Multisensor-fusion for 3D full-body human motion capture. In: IEEE Conference on Computer Vision and Pattern Recognition, CVPR (June 2010) (to appear)
  19. Roetenberg, D.: Inertial and magnetic sensing of human motion. PhD thesis (2006)
  20. Rosenhahn, B., Brox, T., Seidel, H.-P.: Scaled motion dynamics for markerless motion capture. In: IEEE Conference on Computer Vision and Pattern Recognition, Minneapolis, Minnesota, pp. 1203–1210. IEEE (2007)
  21. Schmaltz, C., Rosenhahn, B., Brox, T., Cremers, D., Weickert, J., Wietzke, L., Sommer, G.: Region-based pose tracking. In: Martí, J., Benedí, J.M., Mendonça, A.M., Serrat, J. (eds.) IbPRIA 2007. LNCS, vol. 4478, pp. 56–63. Springer, Heidelberg (2007)
  22. Seo, Y., Choi, Y.-J., Lee, S.W.: A branch-and-bound algorithm for globally optimal calibration of a camera-and-rotation-sensor system. In: IEEE 12th International Conference on Computer Vision, ICCV (September 2009)
  23. Shiratori, T., Hodgins, J.K.: Accelerometer-based user interfaces for the control of a physically simulated character. In: ACM SIGGRAPH Asia, pp. 1–9. ACM, New York (2008)
  24. Shiu, Y.C., Ahmad, S.: Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form AX = XB. IEEE Transactions on Robotics and Automation 5(1), 16–29 (1989)
  25. Shoemake, K.: Animating rotation with quaternion curves. ACM SIGGRAPH Computer Graphics 19(3), 245–254 (1985)
  26. Sidenbladh, H., Black, M.J., Sigal, L.: Implicit probabilistic models of human motion for synthesis and tracking. In: Heyden, A., Sparr, G., Nielsen, M., Johansen, P. (eds.) ECCV 2002, Part I. LNCS, vol. 2350, pp. 784–800. Springer, Heidelberg (2002)
  27. Sigal, L., Black, M.: HumanEva: Synchronized video and motion capture dataset for evaluation of articulated human motion. Technical Report CS-06-08, Brown University, USA (2006), http://vision.cs.brown.edu/humaneva/
  28. Slyper, R., Hodgins, J.: Action capture with accelerometers. In: ACM SIGGRAPH/Eurographics Symposium on Computer Animation (July 2008)
  29. Strobl, K., Hirzinger, G.: Optimal hand-eye calibration. In: 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4647–4653 (October 2006)
  30. Tao, Y., Hu, H., Zhou, H.: Integration of vision and inertial sensors for 3D arm motion tracking in home-based rehabilitation. International Journal of Robotics Research 26(6), 607–624 (2007)
  31. Thong, Y.K., Woolfson, M.S., Crowe, J.A., Hayes-Gill, B.R., Jones, D.A.: Numerical double integration of acceleration measurements in noise. Measurement 36(1), 73–92 (2004)
  32. Tsai, R., Lenz, R.: Real time versatile robotics hand/eye calibration using 3D machine vision. In: Proceedings of the IEEE International Conference on Robotics and Automation, vol. 1, pp. 554–561 (April 1988)
  33. Vlasic, D., Adelsberger, R., Vannucci, G., Barnwell, J., Gross, M., Matusik, W., Popović, J.: Practical motion capture in everyday surroundings. ACM Transactions on Graphics 26(3), 35 (2007)
  34. Vlasic, D., Baran, I., Matusik, W., Popović, J.: Articulated mesh animation from multi-view silhouettes. ACM Transactions on Graphics 27(3), 1–9 (2008)
  35. Xsens Motion Technologies, http://www.xsens.com/ (accessed November 19, 2009)

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Andreas Baak (1)
  • Thomas Helten (1)
  • Meinard Müller (1)
  • Gerard Pons-Moll (2)
  • Bodo Rosenhahn (2)
  • Hans-Peter Seidel (1)
  1. Saarland University & MPI Informatik, Germany
  2. Leibniz Universität Hannover, Germany