
W4S: A real-time system for detecting and tracking people in 2 1/2D

  • Ismail Haritaoglu
  • David Harwood
  • Larry S. Davis
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1406)

Abstract

W4S is a real-time visual surveillance system for detecting and tracking people and monitoring their activities in an outdoor environment; it integrates real-time stereo computation into an intensity-based detection and tracking system. Unlike many systems for tracking people, W4S makes no use of color cues. Instead, it employs a combination of stereo, shape analysis and tracking to locate people and their body parts (head, hands, feet, torso) and to build models of their appearance so that they can be tracked through interactions such as occlusions. W4S can track multiple people simultaneously, even under occlusion, and runs at 5–20 Hz on 320×120 resolution images on a dual Pentium 200 PC.
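The paper itself contains no code; the sketch below is only an illustration, in NumPy/SciPy, of the kind of intensity-based background subtraction and silhouette shape analysis the abstract describes. Every function name, threshold, and parameter is a hypothetical choice, and the stereo component that W4S adds (for example, to suppress shadows and separate people at different depths) is omitted here.

```python
# Hypothetical sketch (not the authors' code) of intensity-based background
# subtraction followed by silhouette analysis, in the spirit of the abstract.
import numpy as np
from scipy import ndimage


def build_background_model(frames):
    """Pixel-wise median and median absolute deviation over a training sequence."""
    stack = np.stack(frames).astype(np.float32)
    median = np.median(stack, axis=0)
    mad = np.median(np.abs(stack - median), axis=0) + 1e-3  # avoid zero deviation
    return median, mad


def detect_foreground(frame, median, mad, k=4.0):
    """Mark pixels whose intensity deviates strongly from the background model."""
    return np.abs(frame.astype(np.float32) - median) > k * mad


def extract_people(mask, min_area=200):
    """Connected-component analysis; return bounding boxes with crude head/feet points."""
    labels, n = ndimage.label(mask)
    blobs = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if ys.size < min_area:          # discard small noise blobs
            continue
        top, bottom = ys.min(), ys.max()
        # Crude shape analysis: head near the top of the silhouette,
        # feet near the bottom, each at the median column of those rows.
        head_x = int(np.median(xs[ys <= top + 2]))
        feet_x = int(np.median(xs[ys >= bottom - 2]))
        blobs.append({
            "bbox": (int(xs.min()), int(top), int(xs.max()), int(bottom)),
            "head": (head_x, int(top)),
            "feet": (feet_x, int(bottom)),
        })
    return blobs
```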



Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • Ismail Haritaoglu (1)
  • David Harwood (1)
  • Larry S. Davis (1)

  1. Computer Vision Laboratory, University of Maryland, College Park, USA
