Dynamic Integration of Generalized Cues for Person Tracking

  • Kai Nickel
  • Rainer Stiefelhagen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5305)

Abstract

We present an approach for the dynamic combination of multiple cues in a particle filter-based tracking framework. The proposed algorithm combines democratic integration with layered sampling and handles both deficiencies of individual features and partial occlusion through the same dynamic fusion mechanism. A set of simple but fast cues is defined, which allows the system to cope with limited computational resources. Tracks are initialized automatically by a dedicated attention tracker that continuously scans the surroundings.
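
As a concrete illustration of the fusion scheme described above, the following minimal Python sketch shows a particle filter whose observation model is a reliability-weighted sum of per-cue scores, with the reliabilities adapted online in the spirit of democratic integration. This is a sketch under stated assumptions, not the authors' implementation: the two cue functions are hypothetical placeholders that would normally score particles against the current frame, and the random-walk motion model, time constant tau, and noise level are illustrative choices only.

import numpy as np

# Sketch only: a bootstrap particle filter that fuses several cue scores
# with dynamically adapted reliabilities, in the spirit of democratic
# integration. The cues below are hypothetical placeholders.

def color_cue(particles, frame):
    # Placeholder: score each particle in [0, 1] against the frame.
    return np.random.rand(len(particles))

def motion_cue(particles, frame):
    # Placeholder: score each particle in [0, 1] against the frame.
    return np.random.rand(len(particles))

CUES = [color_cue, motion_cue]

def track_step(particles, weights, reliabilities, frame, tau=10.0, noise=2.0):
    n = len(particles)

    # 1. Propagate particles with a simple random-walk motion model.
    particles = particles + np.random.normal(scale=noise, size=particles.shape)

    # 2. Score every particle with every cue: shape (n_cues, n).
    scores = np.stack([cue(particles, frame) for cue in CUES])

    # 3. Fuse the cue scores with the current reliabilities (weighted sum)
    #    and update the importance weights.
    fused = reliabilities @ scores
    weights = weights * fused
    weights = weights / weights.sum()

    # 4. Democratic-integration-style feedback: a cue's quality is its
    #    normalized score at the current estimate (here, the best particle);
    #    reliabilities relax toward the qualities with time constant tau.
    best = np.argmax(weights)
    qualities = scores[:, best] / (scores[:, best].sum() + 1e-12)
    reliabilities = reliabilities + (qualities - reliabilities) / tau
    reliabilities = reliabilities / reliabilities.sum()

    # 5. Resample so that the weights become uniform again.
    idx = np.random.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n), reliabilities

# Example: 200 two-dimensional particles, uniform initial reliabilities.
particles = np.random.uniform(0.0, 100.0, size=(200, 2))
weights = np.full(200, 1.0 / 200)
reliabilities = np.full(len(CUES), 1.0 / len(CUES))
particles, weights, reliabilities = track_step(particles, weights, reliabilities, frame=None)

Layered sampling, which the paper combines with this weighting scheme, would instead apply the cues in successive stages with a resampling step in between, so that cheap cues can concentrate the particles before more expensive ones are evaluated. Partial occlusion is then handled by the same reliability dynamics: a cue whose quality drops, for example because its feature is occluded, automatically receives less weight in the fusion.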

Keywords

Partial Occlusion · Proposal Distribution · Layered Sampling · Dynamic Integration · Limited Computational Resource
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Kai Nickel (1)
  • Rainer Stiefelhagen (1)
  1. Universität Karlsruhe (TH), InterACT, Karlsruhe, Germany
