Complex Articulated Object Tracking

  • Andrew I. Comport
  • Éric Marchand
  • François Chaumette
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3179)

Abstract

In this paper, new results are presented for tracking complex multi-body objects. The theoretical framework is based on robotics techniques and uses an a priori model of the object that includes a general mechanical-link description. A new kinematic-set formulation exploits the fact that articulated degrees of freedom are directly observable from the camera, so their estimation does not need to pass through a kinematic chain back to the root. This makes the tracking efficient and precise, leading to real-time performance and accurate measurements. The approach is based locally on an accurate modeling of a distance criterion. A general method is given for defining any type of mechanical link, and experimental results demonstrate prismatic, rotational, and helical links. A statistical M-estimation technique is applied to improve robustness. A monocular camera was used as a real-time sensor to validate the theory.
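
To make the robust estimation step concrete, below is a minimal NumPy sketch of how an M-estimator is commonly folded into a weighted least-squares update of the pose and articulation parameters. It is an illustration only, not the authors' implementation: the function names (huber_weights, robust_update), the Huber tuning constant k, the gain lam, and the synthetic Jacobian/error data are all assumptions made for the example.

    import numpy as np

    def huber_weights(e, k=1.345):
        # Huber M-estimator weights for a residual vector e.
        # Residuals are normalised by a robust scale estimate (MAD).
        sigma = 1.4826 * np.median(np.abs(e - np.median(e))) + 1e-12
        r = np.abs(e) / sigma
        w = np.ones_like(r)
        w[r > k] = k / r[r > k]
        return w

    def robust_update(J, e, lam=0.7):
        # One iteratively re-weighted least-squares step:
        #   v = -lam * (W J)^+ (W e),  with W down-weighting outliers.
        # J : (n, p) stacked Jacobian of the distance features w.r.t. the
        #     pose + articulation parameters;  e : (n,) distance errors.
        w = np.sqrt(huber_weights(e))   # sqrt: weights enter the normal equations squared
        v, *_ = np.linalg.lstsq(J * w[:, None], -lam * (e * w), rcond=None)
        return v

    # Toy usage: 100 distance features, 6 pose + 1 articulation parameter, 20% outliers.
    rng = np.random.default_rng(0)
    J = rng.normal(size=(100, 7))
    v_true = rng.normal(size=7)
    e = J @ v_true + 0.01 * rng.normal(size=100)
    e[rng.choice(100, 20, replace=False)] += 5.0 * rng.normal(size=20)  # gross outliers
    print(np.round(robust_update(J, e, lam=1.0), 3))  # close to -v_true despite outliers

In a tracker of this kind, the weights would be recomputed from the current distance residuals at every iteration, so gross outliers are progressively down-weighted rather than discarded outright.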

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Andrew I. Comport (1)
  • Éric Marchand (1)
  • François Chaumette (1)

  1. IRISA – INRIA Rennes, Campus de Beaulieu, Rennes, France