Markerless Outdoor Localisation Based on SIFT Descriptors for Mobile Applications

  • Frank Lorenz Wendt
  • Stéphane Bres
  • Bruno Tellez
  • Robert Laurini
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5099)

Abstract

This study proposes augmented reality on mobile devices based on SIFT (Scale Invariant Feature Transform) features for markerless outdoor augmented reality applications. The target application is navigation help in a city. SIFT features extracted from reference images of a city square are projected onto a digital model of the square's building façades to obtain 3D co-ordinates for each feature point. The implemented algorithms compute the camera pose for each frame of a video from 3D–2D point correspondences between features extracted in the current frame and points in the reference dataset. The algorithms were successfully tested on videos of city squares. Although they do not operate in real time, they produce correct pose estimates and project artificial data into the scene, and in case of a loss of track they recover automatically. The study shows the potential of SIFT features for purely image-based markerless outdoor augmented reality applications. This study takes place within the MoSAIC project.

Keywords

Content-based image retrieval · Image matching · Augmented reality · SIFT · Building recognition · Pose estimation


Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Frank Lorenz Wendt (1)
  • Stéphane Bres (1)
  • Bruno Tellez (1)
  • Robert Laurini (1)

  1. LIRIS UMR 5205 CNRS, INSA-Lyon, France