International Journal of Computer Vision, Volume 91, Issue 1, pp 45–58

Multi-Camera Tracking with Adaptive Resource Allocation

  • Bohyung Han
  • Seong-Wook Joo
  • Larry S. Davis


Abstract

Sensor fusion for object tracking is attractive because integrating multiple sensors and/or algorithms with different characteristics can improve performance. However, sensor fusion techniques face several critical limitations: (1) the measurement cost typically grows in proportion to the number of sensors, (2) it is not straightforward to measure the confidence of each source and weight it appropriately for state estimation, and (3) there is no principled algorithm for dynamically allocating resources to improve performance and efficiency. We describe a method that fuses information from multiple sensors and estimates the current tracker state using a mixture of sequential Bayesian filters (e.g., particle filters), one filter per sensor, where each filter contributes at a different level to estimating the combined posterior in a reliable manner. In this framework, the sensors interact to dynamically determine an appropriate sensor for each particle; each particle is allocated to exactly one sensor for measurement, and a different number of particles is assigned to each sensor. The contribution level of each sensor changes dynamically based on its prior information and relative measurement confidence. We apply this technique to visual tracking with multiple cameras and demonstrate its effectiveness through tracking results in videos.
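To make the allocation idea concrete, here is a minimal Python sketch that divides a fixed particle budget across sensors in proportion to their relative measurement confidences. The function `allocate_particles` and the purely proportional rule are hypothetical simplifications for illustration only; the paper's actual scheme also incorporates each sensor's prior information, not just confidence.

```python
def allocate_particles(total, confidences):
    """Split a fixed particle budget across sensors in proportion to
    their relative measurement confidences (a simplified, hypothetical
    rule). Counts are truncated to integers, and the rounding remainder
    is given to the first sensor so the budget is always spent exactly."""
    z = sum(confidences)
    counts = [int(total * c / z) for c in confidences]
    counts[0] += total - sum(counts)  # absorb rounding remainder
    return counts

# Example: three sensors with confidences 0.6, 0.3, 0.1 share 100 particles.
print(allocate_particles(100, [0.6, 0.3, 0.1]))  # [60, 30, 10]
```

In the paper's setting these counts would be recomputed at every frame, so a sensor whose measurements become unreliable (e.g., a camera losing the target to occlusion) receives fewer particles on subsequent frames while more trustworthy sensors receive more.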


Keywords

Object tracking · Resource allocation · Multi-camera tracking · Sensor fusion · Kernel-based Bayesian filtering · Mixture model





Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  1. Dept. of Computer Science and Engineering, Pohang University of Science and Technology (POSTECH), Pohang, Korea
  2. Google Inc., Mountain View, USA
  3. Dept. of Computer Science, University of Maryland, College Park, USA
