Tracking a person with pre-recorded image database and a pan, tilt, and zoom camera

  • Yiming Ye
  • John K. Tsotsos
  • Karen Bennet
  • Eric Harley
Poster Session III
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1352)

Abstract

This paper proposes a novel tracking strategy that can robustly track a person or other object within a fixed environment using a pan, tilt, and zoom camera with the help of a pre-recorded image database. We define the Minimum Camera Parameter Settings (MCPS), a set containing just enough camera states to survey the entire environment for the target; this set is used to facilitate both tracking and segmentation. The idea is to store a background image of the environment for every camera state in the MCPS, thus creating an image database. During tracking, camera movements are restricted to states in the MCPS, so scanning for the target and segmenting it from the background are simplified: each current image can be compared directly with the corresponding pre-recorded background image.
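The core idea of the abstract — one stored background image per camera state in the MCPS, with the target segmented by differencing the current frame against that background — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the state tuples, image sizes, database layout, and threshold are all assumptions.

```python
import numpy as np

# Hypothetical MCPS: a small set of (pan, tilt, zoom) camera states
# chosen so that together they cover the whole environment.
MCPS = [(0, 0, 1), (30, 0, 1), (60, 0, 1)]

# Pre-recorded image database: one background image per MCPS state.
# Here each background is a flat grayscale image for determinism.
database = {
    state: np.full((48, 64), 100 + 10 * i, dtype=np.uint8)
    for i, state in enumerate(MCPS)
}

def segment_target(state, frame, threshold=30):
    """Segment the target by differencing the current frame against
    the stored background for this camera state (must be in MCPS)."""
    background = database[state]
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold  # boolean foreground mask

# Simulate a frame at one MCPS state containing a bright target region.
state = MCPS[1]
frame = database[state].copy()
frame[10:20, 20:30] = 250  # the "person" entering the view
mask = segment_target(state, frame)
# mask is True exactly over the 10x10 target region.
```

Restricting camera motion to MCPS states is what makes this cheap: the camera never lands in a pose for which no background image exists, so segmentation never requires background modeling on the fly.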

Keywords

Tracking · Image database · Segmentation



Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • Yiming Ye (1)
  • John K. Tsotsos (2)
  • Karen Bennet (3)
  • Eric Harley (2)
  1. IBM T.J. Watson Research Center, Yorktown Heights
  2. Department of Computer Science, University of Toronto, Canada
  3. IBM Canada Center for Advanced Studies, North York