
Real-time multi-scale tracking based on compressive sensing

  • Original Article
  • The Visual Computer

Abstract

Tracking-by-detection methods have been widely studied and have produced promising results. These methods use discriminative appearance models to train and update online classifiers, employ a sliding window to collect candidate samples to be classified, and select the location of the sample with the maximum classifier response as the new target location. Compressive tracking (CT) was recently proposed with an appearance model based on features extracted in the compressed domain. However, CT uses a fixed-size tracking box to detect samples, which is unsuitable for practical applications. CT also detects samples within a fixed radius around the region selected in the previous frame, so the classifier may become inaccurate if the selected region drifts, and the fixed radius is ill-suited to targets that undergo abrupt changes in acceleration. Furthermore, CT updates the classifier parameters with a constant learning rate; if the target is fully occluded for an extended period, the classifier instead learns the features of the occluding object and the target is ultimately lost. In this paper, we present a multi-scale compressive tracker that integrates an improved appearance model, based on normalized rectangle features extracted in an adaptive compressive domain, into a bootstrap filter. This feature extraction is efficient, and its computational complexity does not increase as the tracking region grows. The classifier response is used to generate particle importance weights, and a resampling procedure retains particles according to their weights. A second-order transition model takes the target velocity into account to estimate the current position and scale, so sampling is not limited to a fixed range. Feedback strategies are adopted to adjust the learning rate under occlusion. Experimental results on various challenging benchmark sequences demonstrate the superior performance of our tracker compared with several state-of-the-art tracking algorithms.
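For a concrete picture of the pipeline described above, the following Python sketch illustrates the main ingredients in simplified form: normalized rectangle features computed from an integral image (so the cost of a feature does not grow with the tracking-region size), a classifier response converted into particle importance weights followed by systematic resampling, a second-order transition model, and an occlusion-aware learning rate. This is a minimal sketch under assumed function names, shapes, and constants, not the authors' implementation.

# Minimal, illustrative sketch (not the authors' code) of the components the
# abstract describes. All names and constants are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

def integral_image(frame):
    """Summed-area table: any rectangle sum becomes four lookups, so the cost of
    a feature does not grow with the size of the tracking region."""
    return np.pad(frame.astype(np.float64), ((1, 0), (1, 0))).cumsum(0).cumsum(1)

def normalized_rect_features(ii, box, templates, signs):
    """Compressive-style features for one candidate box: each feature is a sparse
    signed sum of a few rectangle means. Templates hold box-relative fractions
    (fy, fx, fh, fw) with fy+fh <= 1 and fx+fw <= 1, so they rescale with the box;
    signs holds the corresponding +/-1 weights."""
    by, bx, bh, bw = box
    out = np.zeros(len(templates))
    for i, (rects, sgn) in enumerate(zip(templates, signs)):
        vals = []
        for fy, fx, fh, fw in rects:
            y, x = by + int(fy * bh), bx + int(fx * bw)
            h, w = max(1, int(fh * bh)), max(1, int(fw * bw))
            s = ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]
            vals.append(s / (h * w))  # rectangle mean: normalized by area
        out[i] = float(np.dot(sgn, vals))
    return out

def second_order_transition(particles, prev_particles, noise=(2.0, 2.0, 0.02)):
    """Second-order motion model: propagate each particle (x, y, scale) with the
    velocity estimated from the two previous frames plus Gaussian diffusion, so
    sampling is not confined to a fixed radius."""
    velocity = particles - prev_particles
    return particles + velocity + rng.normal(0.0, noise, size=particles.shape)

def classifier_response(features, mu_pos, sig_pos, mu_neg, sig_neg):
    """Naive Bayes log-likelihood ratio (one response per particle), in the style
    of compressive-tracking classifiers with Gaussian class models."""
    def log_gauss(x, mu, sig):
        return -0.5 * ((x - mu) / sig) ** 2 - np.log(sig)
    return (log_gauss(features, mu_pos, sig_pos)
            - log_gauss(features, mu_neg, sig_neg)).sum(axis=1)

def systematic_resample(particles, responses):
    """Turn classifier responses into normalized importance weights and keep
    particles in proportion to those weights."""
    responses = np.asarray(responses, dtype=np.float64)
    w = np.exp(responses - responses.max())
    w /= w.sum()
    positions = (np.arange(len(w)) + rng.random()) / len(w)
    idx = np.minimum(np.searchsorted(np.cumsum(w), positions), len(w) - 1)
    return particles[idx]

def adaptive_learning_rate(base_rate, best_response, occlusion_threshold):
    """Feedback strategy: when the best response collapses (likely full occlusion),
    suspend the update so the classifier does not learn the occluding object."""
    return base_rate if best_response > occlusion_threshold else 0.0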


Notes

  1. http://visual-tracking.net.

  2. http://vision.ucsd.edu/~bbabenko/project_miltrack.shtml.

  3. http://www4.comp.polyu.edu.hk/~cslzhang/CT/CT.htm.


Acknowledgments

This research work was supported by the National Natural Science Foundation of China (Grant Nos. 51074169 and 51134024) and the National High Technology Research and Development Program of China (863 Program, Grant No. 2012AA062203). The authors would like to thank the anonymous reviewers for their helpful comments and suggestions.

Author information


Corresponding author

Correspondence to Ni Jia.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mpg 24678 KB)


About this article


Cite this article

Wu, Y., Jia, N. & Sun, J. Real-time multi-scale tracking based on compressive sensing. Vis Comput 31, 471–484 (2015). https://doi.org/10.1007/s00371-014-0942-5
