A Bayesian Framework for Multi-cue 3D Object Tracking

  • J. Giebel
  • D. M. Gavrila
  • C. Schnörr
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3024)


This paper presents a Bayesian framework for multi-cue 3D tracking of deformable objects. The proposed spatio-temporal object representation involves a set of distinct linear subspace models, or Dynamic Point Distribution Models (DPDMs), which can deal with both continuous and discontinuous appearance changes; the representation is learned fully automatically from training data. The representation is enriched with texture information by means of intensity histograms, which are compared using the Bhattacharyya coefficient. Direct 3D measurements are furthermore provided by a stereo system.
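The histogram comparison mentioned above can be made concrete with a short sketch. The Bhattacharyya coefficient between two normalized histograms is the sum over bins of the square roots of the bin-wise products; the function name and the 8-bin example values below are illustrative, not taken from the paper:

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two histograms.

    Both inputs are normalized to sum to one; the result lies in
    [0, 1], with 1 indicating identical distributions and 0
    indicating non-overlapping ones.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(np.sqrt(p * q)))

# Two hypothetical 8-bin intensity histograms of a tracked region
h1 = [4, 9, 12, 7, 3, 1, 0, 0]
h2 = [3, 10, 11, 8, 4, 0, 0, 0]
similarity = bhattacharyya(h1, h2)
```

In a tracker, `1 - similarity` (or a related distance) is typically plugged into an exponential to form a texture likelihood for each candidate state.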

State propagation is achieved by a particle filter which combines the three cues, shape, texture, and depth, in its observation density function. The tracking framework integrates an independently operating object detection system by means of importance sampling. We illustrate the benefit of our integrated multi-cue tracking approach on pedestrian tracking from a moving vehicle.
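The idea of integrating an external detector into a particle filter via importance sampling (in the spirit of ICONDENSATION) can be sketched as follows. Everything here is a simplified illustration, not the authors' implementation: the state is one-dimensional, the mixture weight `q_detect` is a made-up parameter, and detector samples are injected directly rather than weighted with a full importance correction:

```python
import numpy as np

rng = np.random.default_rng(0)

def mixture_particle_step(particles, weights, dynamics, detector_samples,
                          likelihood, q_detect=0.3):
    """One condensation-style predict/update step with detector injection.

    With probability q_detect a new particle is drawn from the external
    detector's output; otherwise an existing particle is resampled
    according to its weight and propagated through the dynamics. New
    weights come from the observation likelihood (in the paper this
    density would combine shape, texture and depth cues).
    """
    n = len(particles)
    new_particles = np.empty_like(particles)
    for i in range(n):
        if detector_samples.size and rng.random() < q_detect:
            # importance sample from the detector's proposal
            k = rng.integers(len(detector_samples))
            new_particles[i] = detector_samples[k]
        else:
            # resample an old particle, then predict via the dynamics
            j = rng.choice(n, p=weights)
            new_particles[i] = dynamics(particles[j])
    new_weights = np.array([likelihood(x) for x in new_particles])
    new_weights /= new_weights.sum()
    return new_particles, new_weights

# Toy example: track a stationary target at position 2.0
particles = rng.normal(0.0, 1.0, size=100)
weights = np.full(100, 1.0 / 100)
dynamics = lambda x: x + rng.normal(0.0, 0.2)          # random-walk model
lik = lambda x: np.exp(-0.5 * ((x - 2.0) / 0.5) ** 2)  # toy observation density
detections = np.array([1.9, 2.1])                      # hypothetical detector output
for _ in range(5):
    particles, weights = mixture_particle_step(
        particles, weights, dynamics, detections, lik)
```

The injected detector samples let the filter recover targets the prediction prior would miss, while the ordinary resampling branch preserves temporal continuity.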


Keywords: Importance Sampling, Object Tracking, Bayesian Framework, Active Track, Deformable Object



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • J. Giebel (1)
  • D. M. Gavrila (1)
  • C. Schnörr (2)

  1. Machine Perception, DaimlerChrysler Research and Technology, Ulm, Germany
  2. Computer Vision, Graphics and Pattern Recognition Group, Department of Mathematics and Computer Science, University of Mannheim, Mannheim, Germany
