
Improving the anti-occlusion ability of correlation filter-based trackers via segmentation


Abstract

In recent years, correlation filter (CF)-based trackers have undergone rapid development and achieved state-of-the-art performance. However, CF-based trackers lack the ability to perceive local variations of the target, such as occlusion, because they rely on features extracted from a whole bounding box to distinguish the target. In contrast, segmentation-based trackers, which distinguish the target using pixel- or superpixel-level information, can readily perceive such local variations. However, their robustness is inferior to that of CF methods owing to the lack of high-level semantic information. In this research, the advantages of both approaches were combined to improve the anti-occlusion ability of CF trackers. The image segmentation-based occlusion estimation agency (ISBOEA) method is proposed to perceive occlusions, and the resulting occlusion information is used to guide the training and searching of CF trackers. In the experiments, two CF-based trackers, namely the background-aware CF (BACF) tracker and the hand-crafted-feature version of the efficient convolution operators tracker (ECO_HC), were employed as baselines, and the minimum barrier distance (MBD) transform was employed for image segmentation. Extensive experiments were performed on standard benchmark datasets, namely OTB50, OTB100, GOT-10k and VIVID. The results demonstrate that the proposed ISBOEA method can markedly improve the anti-occlusion ability of CF-based trackers.
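The central idea is that a per-frame occlusion estimate derived from segmentation can gate how strongly the CF model is updated, so that the filter is not contaminated by the occluder's appearance. The sketch below is a minimal illustration of that principle, not the authors' implementation: the binary target masks, the occlusion threshold and the learning-rate schedule are all assumptions introduced here for clarity.

```python
import numpy as np

def occlusion_ratio(curr_mask: np.ndarray, prev_mask: np.ndarray) -> float:
    """Fraction of previously visible target pixels that are absent from the
    current segmentation mask - a crude, illustrative occlusion proxy."""
    prev_area = prev_mask.sum()
    if prev_area == 0:
        return 0.0
    lost = np.logical_and(prev_mask, np.logical_not(curr_mask)).sum()
    return float(lost) / float(prev_area)

def update_cf_model(model: np.ndarray, new_sample: np.ndarray, occ: float,
                    base_lr: float = 0.015, occ_threshold: float = 0.5) -> np.ndarray:
    """Running-average update of a CF template (as in BACF/ECO_HC-style trackers),
    with the learning rate suppressed when the estimated occlusion is severe."""
    lr = 0.0 if occ > occ_threshold else base_lr * (1.0 - occ)
    return (1.0 - lr) * model + lr * new_sample
```

Freezing or shrinking the update under heavy occlusion mirrors the adaptive-learning-rate strategy common in occlusion-aware CF trackers; here the occlusion signal simply comes from segmentation masks rather than from the response map.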



Author information

Corresponding author

Correspondence to Hongying Zhao.



About this article


Cite this article

Jiang, K., Yan, L., Zhang, Z. et al. Improving the anti-occlusion ability of correlation filter-based trackers via segmentation. Appl Intell 53, 2815–2824 (2023). https://doi.org/10.1007/s10489-021-03058-y
