Machine Vision and Applications, Volume 27, Issue 8, pp 1339–1349

Context-based multi-target tracking with occlusion handling

Special Issue Paper

Abstract

Occlusion has long been a core challenge in multi-target tracking. In this paper we present context-based tracking strategies and demonstrate them on two very different types of targets, namely vehicles and fruit flies, which represent distinct target categories: individually identifiable targets with relatively consistent trajectories versus nearly identical targets with highly irregular trajectories. The two classes are recorded with mobile or static camera systems, respectively, and exhibit long-term or high-frequency occlusion scenarios. Occlusions among rigid vehicles show varied patterns because of the mobile recording platform and the dynamic traffic environment. In contrast, a high-density scene of fruit flies contains hundreds of targets for which individual occlusions are relatively short but occur very frequently. We propose tracking systems that exploit context information and show that they address both application scenarios. The proposed strategy outperforms state-of-the-art methods in both cases, and experimental results also demonstrate the efficiency of the proposed systems for occlusion handling.
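The abstract does not spell out the algorithmic details, but a generic tracking-by-detection loop with short-term occlusion handling illustrates the kind of pipeline involved. The sketch below is an assumption-laden toy, not the authors' method: the Track class, the distance gate, the constant-velocity coasting, and all thresholds are hypothetical.

# Minimal tracking-by-detection sketch with simple occlusion handling:
# tracks are matched to detections by nearest distance (Hungarian assignment),
# and unmatched tracks coast on a constant-velocity prediction for a few frames
# before being terminated. Illustrative only; names and thresholds are assumptions.

import numpy as np
from scipy.optimize import linear_sum_assignment

class Track:
    def __init__(self, track_id, position):
        self.id = track_id
        self.position = np.asarray(position, dtype=float)
        self.velocity = np.zeros_like(self.position)
        self.missed = 0          # consecutive frames without a matched detection

    def predict(self):
        # Constant-velocity prediction; used to bridge short occlusions.
        return self.position + self.velocity

    def update(self, detection):
        detection = np.asarray(detection, dtype=float)
        self.velocity = detection - self.position
        self.position = detection
        self.missed = 0

def step(tracks, detections, next_id, gate=50.0, max_missed=5):
    """Advance all tracks by one frame given the current detections."""
    detections = [np.asarray(d, dtype=float) for d in detections]
    matched_tracks, matched_dets = set(), set()
    if tracks and detections:
        # Cost = distance between predicted track positions and detections.
        cost = np.array([[np.linalg.norm(t.predict() - d) for d in detections]
                         for t in tracks])
        rows, cols = linear_sum_assignment(cost)
        for r, c in zip(rows, cols):
            if cost[r, c] < gate:             # reject implausible associations
                tracks[r].update(detections[c])
                matched_tracks.add(r)
                matched_dets.add(c)

    # Unmatched tracks: assume a (possibly occluded) target and coast on prediction.
    survivors = []
    for i, t in enumerate(tracks):
        if i not in matched_tracks:
            t.position = t.predict()
            t.missed += 1
        if t.missed <= max_missed:            # drop tracks occluded for too long
            survivors.append(t)

    # Unmatched detections start new tracks.
    for j, d in enumerate(detections):
        if j not in matched_dets:
            survivors.append(Track(next_id, d))
            next_id += 1
    return survivors, next_id

In the vehicle and fruit-fly scenarios of the paper, the association cost and the coasting policy would presumably be driven by richer context (stereo depth, scene geometry, neighbouring targets) rather than by plain image distance, but the overall predict/associate/coast structure is the same.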

Keywords

Tracking · Occlusion · Stereo vision · Context


Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  1. Auckland University of Technology, Auckland, New Zealand
  2. Daimler AG, Boeblingen, Germany
