Robust Visual Tracking with Double Bounding Box Model

  • Junseok Kwon
  • Junha Roh
  • Kyoung Mu Lee
  • Luc Van Gool
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8689)

Abstract

We propose a novel tracking algorithm that robustly tracks highly non-rigid targets using a new bounding box representation, the Double Bounding Box (DBB). In the DBB, a target is described by the combination of an Inner Bounding Box (IBB) and an Outer Bounding Box (OBB), so the objective of visual tracking becomes finding the IBB and OBB rather than a single bounding box; both boxes can be obtained easily via Dempster-Shafer (DS) theory. If the target is highly non-rigid, no single bounding box can include all foreground regions while excluding all background regions. With the DBB, our method does not directly handle the ambiguous regions that contain both foreground and background; it thereby resolves the inherent ambiguity of the single bounding box representation and can track highly non-rigid targets robustly. Finally, our method finds the best target state using a new Constrained Markov Chain Monte Carlo (CMCMC) sampling method, with the constraint that the OBB must contain the IBB. Experimental results show that our method tracks non-rigid targets accurately and robustly, and outperforms state-of-the-art methods.
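
As an illustration of the two ingredients named above (the containment constraint between the OBB and IBB, and Dempster-Shafer evidence combination), here is a minimal sketch assuming axis-aligned boxes and a two-element frame {foreground, background}. The names `BoundingBox`, `contains`, and `combine_masses`, as well as the toy cue masses, are hypothetical illustrations and not taken from the paper.

```python
# Illustrative sketch only: the class and function names are hypothetical
# and do not come from the paper.
from dataclasses import dataclass


@dataclass
class BoundingBox:
    x: float  # left
    y: float  # top
    w: float  # width
    h: float  # height

    def contains(self, other: "BoundingBox") -> bool:
        """True if this box fully encloses `other` (the constraint OBB contains IBB)."""
        return (self.x <= other.x and self.y <= other.y and
                self.x + self.w >= other.x + other.w and
                self.y + self.h >= other.y + other.h)


def combine_masses(m1: dict, m2: dict) -> dict:
    """Dempster's rule of combination over the frame {foreground, background}.

    Masses are assigned to the focal sets "fg", "bg", and "fg_bg" (the ambiguous
    set {fg, bg}); conflicting evidence (fg vs. bg) is renormalised away.
    """
    def meet(a, b):
        # Intersection of two focal sets; None means an empty (conflicting) intersection.
        if a == "fg_bg":
            return b
        if b == "fg_bg":
            return a
        return a if a == b else None

    combined = {"fg": 0.0, "bg": 0.0, "fg_bg": 0.0}
    conflict = 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            inter = meet(a, b)
            if inter is None:
                conflict += pa * pb
            else:
                combined[inter] += pa * pb
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}


if __name__ == "__main__":
    # Toy cues (e.g. colour and shape) assigning belief mass to foreground,
    # background, and the ambiguous set.
    colour_cue = {"fg": 0.6, "bg": 0.1, "fg_bg": 0.3}
    shape_cue = {"fg": 0.5, "bg": 0.2, "fg_bg": 0.3}
    print(combine_masses(colour_cue, shape_cue))

    ibb = BoundingBox(10, 10, 40, 60)
    obb = BoundingBox(0, 0, 80, 100)
    assert obb.contains(ibb)
```

In this picture, the fused belief could guide where the confident-foreground region (IBB) and the foreground-plus-ambiguous region (OBB) are placed, while a constrained sampler such as the paper's CMCMC accepts only states in which the OBB contains the IBB.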

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Junseok Kwon (1)
  • Junha Roh (2)
  • Kyoung Mu Lee (3)
  • Luc Van Gool (1)

  1. Computer Vision Laboratory, ETH Zurich, Switzerland
  2. Imaging Media Research Center, KIST, Korea
  3. Department of ECE, ASRI, Seoul National University, Korea
