
Part-based visual tracking with spatially regularized correlation filters


Abstract

Discriminative Correlation Filters (DCFs) have demonstrated excellent performance in visual object tracking. These methods exploit a periodic assumption on the training samples to efficiently learn a classifier on image patches; unfortunately, this assumption also introduces unwanted boundary effects. Recently, Spatially Regularized Discriminative Correlation Filters (SRDCFs) were proposed to resolve this issue by introducing penalization weights on the filter coefficients, efficiently reducing boundary effects by assigning higher weights to the background region. However, because the target scale varies, defining the penalization ratio is nontrivial, and the target content may end up penalized along with the background. In this paper, we investigate SRDCFs and present a novel and efficient part-based tracking framework that exploits multiple SRDCFs. Compared with existing trackers, the proposed method has several advantages. (1) We define multiple correlation filters that extract features within the extent of the object, thereby alleviating the boundary effect problem and avoiding penalization of the target content. (2) By combining cyclic object shifts with penalized filters to build part-based object trackers, there is no need to divide the training samples into parts. (3) Comprehensive comparisons demonstrate that our approach achieves performance equivalent to that of the baseline SRDCF tracker on a set of benchmark datasets, namely, OTB2013, OTB2015 and VOT2017. In addition, compared with other state-of-the-art trackers, our approach demonstrates superior performance.
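To make the abstract's two key ingredients concrete, the following is a minimal sketch (not the authors' implementation): a MOSSE-style correlation filter learned in closed form in the Fourier domain, where the periodic (cyclic-shift) assumption makes ridge regression exact, together with an SRDCF-style spatial weight map that penalizes filter energy more strongly toward the patch boundary. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def gaussian_label(h, w, sigma=2.0):
    # Desired correlation response: a Gaussian peak, rolled so the
    # maximum sits at index (0, 0) for an unshifted target.
    ys = np.arange(h) - h // 2
    xs = np.arange(w) - w // 2
    g = np.exp(-(ys[:, None] ** 2 + xs[None, :] ** 2) / (2 * sigma ** 2))
    return np.roll(g, (-(h // 2), -(w // 2)), axis=(0, 1))

def train_filter(patch, y, lam=1e-2):
    # Closed-form ridge regression over all cyclic shifts of the patch
    # (MOSSE-style); lam is the regularization weight.
    X = np.fft.fft2(patch)
    Y = np.fft.fft2(y)
    return (np.conj(X) * Y) / (np.conj(X) * X + lam)

def detect(filt, patch):
    # Correlation response for a new patch; the argmax of the response
    # gives the estimated translation of the target.
    return np.real(np.fft.ifft2(filt * np.fft.fft2(patch)))

def srdcf_weights(h, w, base=0.1, scale=3.0):
    # SRDCF-style spatial penalty: a small weight over the target
    # region, growing quadratically toward the patch boundary, so
    # background filter coefficients are suppressed.
    ys = (np.arange(h) - h / 2) / (h / 2)
    xs = (np.arange(w) - w / 2) / (w / 2)
    return base + scale * (ys[:, None] ** 2 + xs[None, :] ** 2)
```

Because the model is linear in the Fourier domain, shifting the patch by (dy, dx) shifts the response peak by the same amount, which is exactly how translation is estimated frame to frame; a part-based variant as described in the abstract would run one such penalized filter per object part and fuse the responses.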



Notes

  1. http://www.votchallenge.net/vot2017/.



Acknowledgements

This study was funded by the National Natural Science Foundation of China (Grant nos. 61702350 and 61472289) and the Open Project Program of State Key Laboratory of Digital Manufacturing Equipment and Technology at HUST (Grant no. DMETKF2017016).

Author information

Correspondence to Dejun Zhang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.



Cite this article

Zhang, D., Zhang, Z., Zou, L. et al. Part-based visual tracking with spatially regularized correlation filters. Vis Comput 36, 509–527 (2020). https://doi.org/10.1007/s00371-019-01634-5


Keywords

  • Correlation filter tracking
  • Discriminative Correlation Filter
  • Part-based tracking
  • Spatially regularized filter