
An improved correlation filter tracking method with occlusion and drift handling

  • Jun Liu
  • Zhongqiang Luo
  • Xingzhong Xiong
Original Article

Abstract

Despite remarkable progress, visual object tracking remains a challenging task because objects often undergo significant appearance changes, fast motion, and severe occlusion. In this paper, we propose a correlation filter-based tracking method with a reliability evaluation and re-detection mechanism (CF-RERM) to handle drift and occlusion. We first propose a criterion that combines the fluctuation trend of the response values, the displacement difference of the object, and the peak-to-sidelobe ratio to comprehensively evaluate the reliability of the tracking process. Then, a re-detection mechanism with a two-stage screening strategy performs re-detection whenever the criterion is triggered. Experimental results show that our method achieves competitive accuracy and success rates on the widely used OTB-50, OTB-100, and Temple-Color-128 tracking benchmarks. In addition, CF-RERM runs at real-time speed.
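The exact criterion, thresholds, and combination rule are defined in the full paper; the minimal Python sketch below only illustrates the kind of reliability check the abstract describes. It computes the peak-to-sidelobe ratio of a correlation response map and combines it with a displacement-difference test and a simple response-trend test. The function names, thresholds, and the way the three cues are fused are illustrative assumptions, not the authors' implementation.

```python
import numpy as np


def peak_to_sidelobe_ratio(response, exclude=5):
    """Peak-to-sidelobe ratio (PSR) of a 2-D correlation response map.

    The sidelobe is the response with a small window around the peak
    excluded; a low PSR usually indicates an unreliable detection.
    """
    peak_idx = np.unravel_index(np.argmax(response), response.shape)
    peak = response[peak_idx]

    # Mask out an (2*exclude+1)^2 window around the peak.
    mask = np.ones(response.shape, dtype=bool)
    r0, c0 = max(peak_idx[0] - exclude, 0), max(peak_idx[1] - exclude, 0)
    mask[r0:peak_idx[0] + exclude + 1, c0:peak_idx[1] + exclude + 1] = False

    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)


def tracking_is_reliable(response, prev_pos, cur_pos, peak_history=None,
                         psr_thresh=8.0, disp_thresh=20.0, drop_ratio=0.5):
    """Hypothetical reliability test combining the three cues named in the
    abstract: response-value trend, displacement difference, and PSR.
    All thresholds here are illustrative placeholders."""
    psr_ok = peak_to_sidelobe_ratio(response) > psr_thresh
    disp_ok = np.linalg.norm(np.asarray(cur_pos, dtype=float)
                             - np.asarray(prev_pos, dtype=float)) < disp_thresh

    # Trend check: the current peak should not collapse relative to the
    # recent history of peak values (peak_history is a list of floats).
    trend_ok = True
    if peak_history:
        trend_ok = response.max() > drop_ratio * np.mean(peak_history)

    # If any cue fails, the tracker would trigger its re-detection stage.
    return psr_ok and disp_ok and trend_ok
```

In this sketch, a `False` return would be the point at which a re-detection module (such as the two-stage screening strategy mentioned above) takes over from the correlation filter.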

Keywords

Visual tracking · Circulant matrices · Correlation filter · Kernel methods · Occlusion

Notes

Acknowledgements

The authors would like to thank the editors and anonymous reviewers for their constructive comments and suggestions, which greatly helped improve the overall quality of the manuscript.

Author Contributions

JL conceived the main idea, designed the algorithm, performed the experiments, analyzed the data, and wrote the manuscript. ZL and XX proofread the manuscript. All the authors discussed the results and commented on the manuscript.

Funding

This work was supported by the National Natural Science Foundation of China (No. 61801319), the Sichuan University of Science and Engineering Talent Introduction Project (No. 2017RCL11), the Opening Project of the Key Laboratory of Higher Education of Sichuan Province for Enterprise Informationalization and Internet of Things (No. 2017WZJ01), the Major Frontier Project of the Science and Technology Plan of Sichuan Province (No. 2018JY0512), the Education Agency Project of Sichuan Province (No. 18ZB0419), and the Sichuan Institute of Technology Graduate Innovation Foundation (No. D10-501128).

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. Artificial Intelligence Key Laboratory of Sichuan Province, Sichuan University of Science and Engineering, Yibin, China
  2. School of Automation and Information Engineering, Sichuan University of Science and Engineering, Yibin, China
  3. Key Laboratory of Higher Education of Sichuan Province for Enterprise Informationalization and Internet of Things, Sichuan University of Science and Engineering, Yibin, China
