
Object Tracking within the Framework of Concept Drift

  • Li Chen
  • Yue Zhou
  • Jie Yang
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7726)

Abstract

It is well known that backgrounds and targets change constantly in real scenes, which weakens the effectiveness of classical tracking algorithms through frequent model mismatches. In this paper, an object tracking algorithm within the framework of concept drift is proposed to address this problem. We detect drift points using a simple message-passing algorithm based on a Bayesian approach, and the resulting probability distribution lays the foundation for the self-adaptation of the new model. Experiments on two real-world changing scenes show that tracking within the framework of concept drift improves robustness and accuracy.
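The abstract does not spell out the drift-point detector, so the following is only a minimal Python sketch of Bayesian online changepoint detection with a run-length posterior over a conjugate Gaussian model, one standard way to detect drift points in the Bayesian spirit described above; it is not the authors' message-passing algorithm, and the function name, priors, hazard rate, and synthetic data are all illustrative assumptions.

```python
import numpy as np
from scipy import stats

def bocpd_gaussian(data, hazard=1.0 / 100.0,
                   mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    """Run-length posterior for Bayesian online changepoint detection
    with a Normal-Inverse-Gamma conjugate model (illustrative sketch)."""
    T = len(data)
    R = np.zeros((T + 1, T + 1))   # R[r, t] = P(run length r at time t)
    R[0, 0] = 1.0
    # Sufficient statistics, one entry per candidate run length
    mu, kappa = np.array([mu0]), np.array([kappa0])
    alpha, beta = np.array([alpha0]), np.array([beta0])
    for t, x in enumerate(data, start=1):
        # Student-t predictive density of x under each run-length hypothesis
        pred = stats.t.pdf(x, df=2 * alpha, loc=mu,
                           scale=np.sqrt(beta * (kappa + 1) / (alpha * kappa)))
        # Growth: the current run continues (no drift at time t)
        R[1:t + 1, t] = R[:t, t - 1] * pred * (1 - hazard)
        # Changepoint: a new run starts (drift point at time t)
        R[0, t] = np.sum(R[:t, t - 1] * pred * hazard)
        R[:, t] /= np.sum(R[:, t])
        # Posterior updates, with the prior prepended for run length 0
        mu_new = (kappa * mu + x) / (kappa + 1)
        beta_new = beta + kappa * (x - mu) ** 2 / (2 * (kappa + 1))
        mu = np.concatenate(([mu0], mu_new))
        kappa = np.concatenate(([kappa0], kappa + 1))
        alpha = np.concatenate(([alpha0], alpha + 0.5))
        beta = np.concatenate(([beta0], beta_new))
    return R

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 1-D "appearance score" with an abrupt scene change at t = 100
    data = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(4.0, 1.0, 100)])
    R = bocpd_gaussian(data)
    # Time steps with the highest changepoint probability flag candidate drift points
    print(np.argsort(R[0, 1:])[-5:])
```

In a tracking context, the input series would be a per-frame appearance or likelihood score; a spike in the run-length-zero posterior marks a candidate drift point at which the appearance model could be re-initialised.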

Keywords

Tracking Algorithm · Object Tracking · Neural Information Processing System · Concept Drift · Scene Change



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Li Chen 1, 2
  • Yue Zhou 1, 2
  • Jie Yang 1, 2
  1. Institute of Image Processing and Pattern Recognition, Shanghai Jiao Tong University, Shanghai, China
  2. Key Laboratory of System Control and Information Processing, Ministry of Education of China, Shanghai Jiao Tong University, Shanghai, China
