Visual tracking of high DOF articulated structures: An application to human hand tracking

  • James M. Rehg
  • Takeo Kanade
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 801)

Abstract

Passive sensing of human hand and limb motion is important for a wide range of applications, from human-computer interaction to athletic performance measurement. High degree-of-freedom articulated mechanisms like the human hand are difficult to track because of their large state spaces and complex image appearance. This article describes a model-based hand tracking system, called DigitEyes, that can recover the state of a 27 DOF hand model from ordinary gray scale images at speeds of up to 10 Hz.
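The abstract does not spell out the estimation procedure, but the keywords (kinematic chain, residual vector) suggest fitting a kinematic model's state to image features by minimizing a residual. As a rough, hypothetical illustration only, and not the paper's actual algorithm, the sketch below applies Gauss-Newton least squares to a toy planar two-link chain: all function names, link lengths, and the finite-difference Jacobian are assumptions made for this example.

```python
import numpy as np

def forward_points(theta, l1=1.0, l2=0.8):
    """Predicted 2-D positions of the two joint endpoints of a planar
    two-link kinematic chain with joint angles theta = (t1, t2).
    Link lengths l1, l2 are arbitrary illustrative values."""
    t1, t2 = theta
    p1 = np.array([l1 * np.cos(t1), l1 * np.sin(t1)])
    p2 = p1 + np.array([l2 * np.cos(t1 + t2), l2 * np.sin(t1 + t2)])
    return np.concatenate([p1, p2])  # 4-vector of feature coordinates

def residual(theta, observed):
    # Residual vector: predicted features minus observed features.
    return forward_points(theta) - observed

def jacobian(theta, eps=1e-6):
    # Central finite-difference Jacobian of the feature vector w.r.t. state.
    J = np.zeros((4, 2))
    for i in range(2):
        d = np.zeros(2)
        d[i] = eps
        J[:, i] = (forward_points(theta + d) - forward_points(theta - d)) / (2 * eps)
    return J

def track(theta0, observed, iters=20):
    """Gauss-Newton refinement of the state estimate from an initial guess,
    e.g. the state recovered in the previous frame of a tracking loop."""
    theta = np.array(theta0, dtype=float)
    for _ in range(iters):
        r = residual(theta, observed)
        J = jacobian(theta)
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)  # least-squares step
        theta = theta + step
    return theta

# Toy usage: recover known joint angles from their projected features.
true_theta = np.array([0.5, 0.3])
observed = forward_points(true_theta)
estimate = track([0.4, 0.2], observed)
```

In a real tracker the state vector would have many more degrees of freedom (27 for the hand model described above), the features would come from image measurements rather than a simulator, and the previous frame's estimate would initialize each new frame.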

Keywords

Kinematic Chain · Visual Tracking · Human Hand · Residual Vector · Hand Tracking
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

References

  1. A. Blake, R. Curwen, and A. Zisserman. A framework for spatiotemporal control in the tracking of visual contours. Int. J. Computer Vision, 11(2):127–145, 1993.
  2. T. Darrell and A. Pentland. Space-time gestures. In Looking at People Workshop, Chambery, France, 1993.
  3. J. Dennis and R. Schnabel. Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Prentice-Hall, Englewood Cliffs, NJ, 1983.
  4. B. Dorner. Hand shape identification and tracking for sign language interpretation. In Looking at People Workshop, Chambery, France, 1993.
  5. D. Gennery. Visual tracking of known three-dimensional objects. Int. J. Computer Vision, 7(3):243–270, 1992.
  6. S. B. Kang and K. Ikeuchi. Grasp recognition using the contact web. In Proc. IEEE/RSJ Int. Conf. on Int. Robots and Sys., Raleigh, NC, 1992.
  7. D. Lowe. Robust model-based motion tracking through the integration of search and estimation. Int. J. Computer Vision, 8(2):113–122, 1992.
  8. R. Mann and E. Antonsson. Gait analysis: precise, rapid, automatic, 3-D position and orientation kinematics and dynamics. Bulletin of the Hospital for Joint Diseases Orthopaedic Institute, XLIII(2):137–146, 1983.
  9. J. O'Rourke and N. Badler. Model-based image analysis of human motion using constraint propagation. IEEE Trans. Pattern Analysis and Machine Intelligence, 2(6):522–536, 1980.
  10. A. Pentland and B. Horowitz. Recovery of nonrigid motion and structure. IEEE Trans. Pattern Analysis and Machine Intelligence, 13(7):730–742, 1991.
  11. J. Rehg and T. Kanade. DigitEyes: Vision-based human hand tracking. Technical Report CMU-CS-TR-93-220, Carnegie Mellon Univ. School of Comp. Sci., 1993.
  12. H. Rijpkema and M. Girard. Computer animation of knowledge-based human grasping. Computer Graphics, 25(4):339–348, 1991.
  13. M. Spong. Robot Dynamics and Control. John Wiley and Sons, 1989.
  14. M. Yamamoto and K. Koshikawa. Human motion analysis based on a robot arm model. In IEEE Conf. Comput. Vis. and Pattern Rec., pages 664–665, 1991.
  15. T. Zimmerman, J. Lanier, C. Blanchard, S. Bryson, and Y. Harvill. A hand gesture interface device. In Proc. Human Factors in Comp. Sys. and Graphics Interface (CHI+GI'87), pages 189–192, Toronto, Canada, 1987.

Copyright information

© Springer-Verlag Berlin Heidelberg 1994

Authors and Affiliations

  • James M. Rehg (1)
  • Takeo Kanade (2)
  1. Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh
  2. The Robotics Institute, Carnegie Mellon University, Pittsburgh
