Marker-Less Tracking for Multi-layer Authoring in AR Books

  • Kiyoung Kim
  • Jonghee Park
  • Woontack Woo
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5709)

Abstract

An Augmented Reality (AR) book is an application that applies AR technologies to physical books to provide a new experience to users. In this paper, we propose a new marker-less tracking method for the AR book. The goal of the tracker is not only to recognize many pages but also to compute the 6-DOF camera pose, so that different virtual content can be augmented on each recognized page. For this purpose, we adopt a multi-core programming approach that separates the page recognition module from the tracking module. The page recognition module uses highly distinctive Scale Invariant Feature Transform (SIFT) features, while the tracking module exploits a coarse-to-fine approach for fast frame-to-frame matching. The resulting tracker runs at more than 30 frames per second. In addition to the tracker, we describe a multi-layer data structure for maintaining the AR book, and present a GUI-based authoring tool that validates the feasibility of the tracker and the data structure. The proposed algorithm should be helpful for creating various AR applications that require tracking of multiple planes.
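As a rough illustration of the two-module design described above, the following Python/OpenCV sketch runs SIFT-based page recognition in a background thread while the per-frame path performs pyramidal frame-to-frame matching and recovers the 6-DOF camera pose from the planar page. This is not the authors' implementation: the names (PAGE_DB, PageRecognizer, track_frame, camera_pose), the matching thresholds, and the use of pyramidal Lucas-Kanade optical flow as a stand-in for the paper's coarse-to-fine matcher are all illustrative assumptions (OpenCV >= 4.4 is assumed for cv2.SIFT_create).

```python
import threading
import time

import cv2
import numpy as np

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher(cv2.NORM_L2)

# Per-page database: reference SIFT keypoints/descriptors plus the authored
# content layers to augment once that page is recognized (structure assumed).
PAGE_DB = {}  # page_id -> {"kp": [...], "desc": ndarray, "layers": [...]}


class PageRecognizer(threading.Thread):
    """Slow path: identify the visible page with SIFT, off the tracking thread."""

    def __init__(self):
        super().__init__(daemon=True)
        self._frame = None
        self._lock = threading.Lock()
        self.page_id = None  # latest recognized page, read by the tracker

    def submit(self, gray):
        # The tracking thread drops in its newest frame; stale ones are discarded.
        with self._lock:
            self._frame = gray

    def run(self):
        while True:
            with self._lock:
                gray, self._frame = self._frame, None
            if gray is None:
                time.sleep(0.005)
                continue
            kp, desc = sift.detectAndCompute(gray, None)
            if desc is None:
                continue
            best_id, best_count = None, 0
            for page_id, ref in PAGE_DB.items():
                pairs = matcher.knnMatch(desc, ref["desc"], k=2)
                good = [p[0] for p in pairs
                        if len(p) == 2 and p[0].distance < 0.7 * p[1].distance]
                if len(good) > best_count:
                    best_id, best_count = page_id, len(good)
            if best_count > 20:  # heuristic acceptance threshold
                self.page_id = best_id


def track_frame(prev_gray, gray, prev_pts):
    """Fast path: frame-to-frame matching over a 3-level image pyramid.

    prev_pts: float32 array of shape (N, 1, 2) with the previously tracked points.
    """
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    return prev_pts[ok], next_pts[ok]


def camera_pose(page_pts_2d, image_pts, K):
    """6-DOF camera pose relative to the planar page (page points lifted to z = 0)."""
    obj = np.hstack([page_pts_2d, np.zeros((len(page_pts_2d), 1))]).astype(np.float32)
    _ok, rvec, tvec = cv2.solvePnP(obj, image_pts.astype(np.float32), K, None)
    return rvec, tvec
```

In a complete system the recognizer would also hand the tracker an initial set of matched reference points whenever the visible page changes; that hand-off and any re-initialization logic are omitted here for brevity.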

Keywords

augmented reality · marker-less tracking · layer authoring · page recognition · AR book · SIFT


Copyright information

© IFIP International Federation for Information Processing 2009

Authors and Affiliations

  • Kiyoung Kim (1)
  • Jonghee Park (1)
  • Woontack Woo (1)
  1. GIST U-VR Lab., Gwangju, South Korea
