Object Tracking Method Using PTAMM and Estimated Foreground Regions

  • So Hayakawa
  • Shinji Fukui
  • Yuji Iwahori
  • M. K. Bhuyan
  • Robert J. Woodham
Chapter

Abstract

This chapter proposes a new approach for tracking moving objects in videos taken by a hand-held camera. The proposed method is based on the particle filter and is robust to occlusion by other objects. The 3D point map computed by Parallel Tracking and Multiple Mapping (PTAMM) is used to obtain the positional relation between the target object and other moving objects. This improves the accuracy of the occlusion judgement and allows the target object to be tracked robustly even when it is hidden by other objects. The method also uses the estimated foreground regions to calculate part of the likelihood, which increases the robustness of the tracking when the camera undergoes rotational motion. The effectiveness of the proposed method is shown through experiments on real videos.
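The chapter itself does not include code. As a rough illustration of the particle-filter core that the proposed method builds on, the sketch below tracks a 2-D point through the standard predict/weight/resample cycle. The Gaussian observation model, the random-walk motion model, and all parameter values here are placeholders: the paper's actual likelihood combines color histograms, the estimated foreground regions, and the PTAMM 3D point map, none of which are reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def likelihood(particles, measurement, sigma=5.0):
    # Stand-in observation model: score each particle by its distance to
    # a simulated measurement. The paper instead evaluates color-histogram
    # similarity and membership in the estimated foreground regions.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def particle_filter_step(particles, weights, measurement, motion_std=2.0):
    # 1. Predict: diffuse particles with a simple random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # 2. Update: re-weight particles by the observation likelihood.
    weights = weights * likelihood(particles, measurement)
    weights = weights / weights.sum()
    # 3. Resample (systematic resampling) to avoid weight degeneracy.
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx], np.full(n, 1.0 / n)

# Track a point drifting across the image plane from noisy observations.
true_pos = np.array([50.0, 50.0])
particles = rng.normal(true_pos, 10.0, (500, 2))
weights = np.full(500, 1.0 / 500)
for _ in range(30):
    true_pos = true_pos + np.array([1.0, 0.5])        # object motion
    meas = true_pos + rng.normal(0.0, 1.0, 2)         # noisy observation
    particles, weights = particle_filter_step(particles, weights, meas)

estimate = particles.mean(axis=0)
print("tracking error:", np.linalg.norm(estimate - true_pos))
```

The weighted-mean of the resampled particle cloud serves as the state estimate; swapping the placeholder `likelihood` for a richer appearance model is where methods such as this chapter's differ.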

Keywords

Object tracking · Particle filter · PTAMM · 3D point map


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • So Hayakawa (1)
  • Shinji Fukui (2)
  • Yuji Iwahori (1)
  • M. K. Bhuyan (3)
  • Robert J. Woodham (4)

  1. Chubu University, Kasugai, Japan
  2. Aichi University of Education, Kariya, Japan
  3. Indian Institute of Technology Guwahati, India
  4. University of British Columbia, Vancouver, Canada