
Adaptive multi-branch correlation filters for robust visual tracking

  • Original Article
  • Neural Computing and Applications

Abstract

In recent years, deep convolutional features have been applied to discriminative correlation filter (DCF) based methods, which have achieved impressive tracking performance. Most of these methods utilize hierarchical features from a single layer. However, this is not always sufficient to learn target appearance changes and to suppress background interference under complicated interfering factors (e.g., deformation, fast motion, low resolution, and rotation). In this paper, we propose an adaptive multi-branch correlation filter tracking method that constructs multi-branch models and uses an adaptive selection strategy to improve the accuracy and robustness of visual tracking. Specifically, the multi-branch models are introduced to tolerate temporal changes of the object, so that different branches can serve different circumstances. In addition, the adaptive selection strategy incorporates both foreground and background information to learn background suppression. To further improve tracking performance, we propose a measurement method that handles tracking failures caused by unreliable samples. Extensive experiments on the OTB-2013, OTB-2015, and VOT-2016 datasets show that the proposed tracker performs comparably to state-of-the-art tracking methods. In particular, on OTB-2015 our method significantly improves on the baseline, with a gain of 5.5% in overlap precision.
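
The abstract summarizes the approach at a high level without implementation details. As a rough illustration of the general multi-branch idea only, the sketch below maintains several single-channel, Fourier-domain correlation filters and selects among their responses using the peak-to-sidelobe ratio as a reliability score. The ridge-regression filter form, the PSR criterion, and all function names are illustrative assumptions, not the authors' actual formulation of the multi-branch models or the adaptive selection strategy.

```python
import numpy as np

def train_filter(patch, target_response, lam=1e-2):
    """Learn one branch's filter in the Fourier domain via the standard
    single-channel ridge-regression closed form (illustrative only)."""
    X = np.fft.fft2(patch)
    Y = np.fft.fft2(target_response)
    return (np.conj(X) * Y) / (np.conj(X) * X + lam)  # element-wise solution

def branch_response(H, search_patch):
    """Correlate a search patch with one branch's filter."""
    Z = np.fft.fft2(search_patch)
    return np.real(np.fft.ifft2(H * Z))

def psr(response, margin=5):
    """Peak-to-sidelobe ratio: a simple reliability score for a response map
    (used here as a stand-in for the paper's selection/measurement criteria)."""
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, py - margin):py + margin + 1,
         max(0, px - margin):px + margin + 1] = False  # exclude the peak area
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-8)

def track_step(branches, search_patch):
    """Evaluate every branch filter and keep the most reliable response."""
    responses = [branch_response(H, search_patch) for H in branches]
    scores = [psr(r) for r in responses]
    best = int(np.argmax(scores))
    dy, dx = np.unravel_index(responses[best].argmax(), responses[best].shape)
    return best, (dy, dx), scores[best]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patch = rng.standard_normal((64, 64))
    yy, xx = np.mgrid[0:64, 0:64]
    target = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 2.0 ** 2))
    # Two hypothetical branches trained on different appearance snapshots.
    branches = [train_filter(patch, target),
                train_filter(0.5 * patch + 1.0, target)]
    print(track_step(branches, patch))
```

In this toy setting each branch corresponds to one appearance snapshot of the target, and the branch whose response map is most confident drives the position estimate for the current frame.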

Acknowledgements

This work is supported by the National Natural Science Foundation of China (Grant Nos. 61872326, 61672475, 61772526) and the Shandong Provincial Natural Science Foundation (Grant No. ZR2019MF044).

Author information

Corresponding author

Correspondence to Zhiqiang Wei.

Ethics declarations

Conflict of interest

We declare that we have no conflict of interest related to this work.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Li, X., Huang, L., Wei, Z. et al. Adaptive multi-branch correlation filters for robust visual tracking. Neural Comput & Applic 33, 2889–2904 (2021). https://doi.org/10.1007/s00521-020-05126-9
