
Markerless Tracking for Augmented Reality

Chapter

Abstract

Augmented Reality (AR) tries to seamlessly integrate virtual content into the real world of the user. Ideally, the virtual content would behave exactly like real objects. This requires a correct and precise estimation of the user’s viewpoint (or rather that of the camera) with respect to the coordinate system of the virtual content, which can be achieved by an appropriate 6-DoF tracking system.
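
In the standard pinhole-camera formulation (given here only as background, not quoted from the chapter), such a 6-DoF pose consists of a rotation and a translation that map points from the coordinate system of the virtual content into the camera frame:

$$
s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
= K \left[\, R \;\middle|\; \mathbf{t} \,\right]
\begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix},
\qquad R \in SO(3),\ \mathbf{t} \in \mathbb{R}^3,
$$

where $K$ holds the intrinsic camera parameters (typically obtained by an offline calibration, cf. [2]) and $(R, \mathbf{t})$ are the six extrinsic parameters the tracker has to estimate for every frame.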

In this chapter we present a general approach for a computer vision (CV) based tracking system applying an adaptive feature-based tracker. We describe the individual steps of the tracking pipeline in detail and discuss a sample implementation based on SURF feature descriptors, allowing for an easy understanding of the steps necessary when building your own CV tracker.
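
As a rough illustration of such a pipeline, the sketch below detects SURF interest points in a camera frame, matches their descriptors against a reference image, and recovers the 6-DoF camera pose with a RANSAC-based PnP solver. It assumes OpenCV with the contrib (xfeatures2d) modules, a calibrated camera, and a planar reference target; all file names, thresholds, and the target size are illustrative assumptions, not values taken from the chapter.

```python
# Minimal sketch of a SURF-based tracking pipeline (illustrative only).
# Assumes OpenCV built with the contrib/nonfree modules (cv2.xfeatures2d),
# a pre-calibrated camera, and a planar reference target of known size.
import numpy as np
import cv2

# Intrinsic camera parameters (assumed known from offline calibration).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)

# Offline step: detect and describe interest points on the reference image.
ref_img = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
ref_kp, ref_desc = surf.detectAndCompute(ref_img, None)

matcher = cv2.FlannBasedMatcher()

def estimate_pose(frame_gray, target_size=(0.2, 0.15)):
    """Return (rvec, tvec) of the camera w.r.t. the planar target, or None."""
    kp, desc = surf.detectAndCompute(frame_gray, None)
    if desc is None:
        return None

    # Match descriptors and keep unambiguous matches (Lowe's ratio test).
    matches = matcher.knnMatch(ref_desc, desc, k=2)
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance]
    if len(good) < 10:
        return None

    # Map reference pixels to metric 3D points on the planar target (z = 0).
    w, h = target_size
    ref_h, ref_w = ref_img.shape
    obj_pts = np.array([[ref_kp[m.queryIdx].pt[0] / ref_w * w,
                         ref_kp[m.queryIdx].pt[1] / ref_h * h,
                         0.0] for m in good], dtype=np.float32)
    img_pts = np.array([kp[m.trainIdx].pt for m in good], dtype=np.float32)

    # Robust 6-DoF pose estimation with RANSAC to reject outlier matches.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(obj_pts, img_pts, K, dist)
    return (rvec, tvec) if ok else None
```

A complete tracker would complement this per-frame matching with frame-to-frame tracking of already matched interest points to reduce the computational load.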

Keywords

Feature Point, Augmented Reality, Interest Point, Image Patch, Feature Match

References

  1. H. Bay, A. Ess, T. Tuytelaars, and L. van Gool, “Speeded-Up Robust Features (SURF),” Computer Vision and Image Understanding (CVIU), Volume 110, Number 3, pages 346–359, 2008
  2. D. C. Brown, “Close-range camera calibration,” Photogrammetric Engineering, Volume 37, Number 8, pages 855–866, 1971
  3. D. Cheng, S. Xie, and E. Hämmerle, “Comparison of local descriptors for image registration of geometrically-complex 3D scenes,” Proceedings of the 14th International Conference on Mechatronics and Machine Vision in Practice (M2VIP), pages 140–145, 2007
  4. A. J. Davison and N. Kita, “3D simultaneous localisation and map-building using active vision for a robot moving on undulating terrain,” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Kauai, 2001
  5. M. A. Fischler and R. C. Bolles, “Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography,” Communications of the ACM, Volume 24, Number 6, 1981
  6. C. Harris and M. Stephens, “A Combined Corner and Edge Detector,” Proceedings of the 4th Alvey Vision Conference, pages 147–151, 1988
  7. J. Herling and W. Broll, “An Adaptive Training Free Tracker for Mobile Phones,” Proceedings of the 17th ACM Symposium on Virtual Reality Software and Technology (VRST 2010), ACM, New York, NY, USA, pages 35–42, 2010
  8. H. Kato and M. Billinghurst, “Marker Tracking and HMD Calibration for a Video-based Augmented Reality Conferencing System,” 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR), 1999
  9. G. Klein and D. Murray, “Parallel Tracking and Mapping for Small AR Workspaces,” Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2007
  10. G. Klein and D. Murray, “Parallel Tracking and Mapping on a Camera Phone,” Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2009
  11. V. Lepetit, P. Lagger, and P. Fua, “Randomized Trees for Real-Time Keypoint Recognition,” Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), pages 775–781, 2005
  12. V. Lepetit, F. Moreno-Noguer, and P. Fua, “EPnP: An Accurate O(n) Solution to the PnP Problem,” International Journal of Computer Vision, Volume 81, Issue 2, 2009
  13. D. G. Lowe, “Distinctive Image Features from Scale-Invariant Keypoints,” International Journal of Computer Vision, Volume 60, Issue 2, 2004
  14. W. H. Press, S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery, “Numerical Recipes: The Art of Scientific Computing,” Cambridge University Press, Third Edition, 2007
  15. M. Özuysal, P. Fua, and V. Lepetit, “Fast Keypoint Recognition in Ten Lines of Code,” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pages 1–8, 2007
  16. E. Rosten and T. Drummond, “Fusing Points and Lines for High Performance Tracking,” Proceedings of the IEEE International Conference on Computer Vision (ICCV), pages 1508–1511, 2005

Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

  1. Department of Virtual Worlds and Digital Games, Ilmenau University of Technology, Ilmenau, Germany
  2. Collaborative Virtual and Augmented Environments, Fraunhofer FIT, Sankt Augustin, Germany
