Journal of Real-Time Image Processing, Volume 2, Issue 2–3, pp 67–68

Editorial for the special issue on markerless real-time tracking for augmented reality image synthesis

  • Michael Felsberg
  • Reinhard Koch
Editorial

Abstract

Augmented reality is a growing field, with diverse applications ranging from TV and film production to industrial maintenance, medicine, education, entertainment and games. The central idea is to add virtual objects into a real scene, either by displaying them in a see-through head-mounted display or by superimposing them on an image of the scene captured by a camera. Depending on the application, the added objects might be virtual characters in a TV or film production, instructions for repairing a car engine, or a reconstruction of an archaeological site. For the effect to be believable, the virtual objects must appear rigidly fixed to the real world, which requires accurate, real-time measurement of the position of the camera or the user's head. Present technology cannot achieve this without resorting to systems that require significant infrastructure in the operating environment, severely restricting the range of possible applications.
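
To make the core requirement concrete, the following minimal sketch (ours, not the authors'; Python with NumPy, with purely illustrative intrinsics and pose values) shows how a virtual 3D anchor point would be drawn into a camera image once the camera pose is known. Any error in the estimated rotation R or translation t shifts the projected pixel, which is why accurate real-time pose measurement is essential.

    # Minimal pinhole-projection sketch (illustrative only, not from the editorial).
    # A tracker such as those described in this issue would supply the camera
    # intrinsics K and pose (R, t); the values below are hypothetical placeholders.
    import numpy as np

    def project_point(X_world, K, R, t):
        """Project a 3D world point into pixel coordinates (u, v)."""
        X_cam = R @ X_world + t            # world -> camera coordinates
        if X_cam[2] <= 0:                  # point behind the camera: not visible
            return None
        x = K @ (X_cam / X_cam[2])         # perspective division, then intrinsics
        return x[:2]

    # Hypothetical 640x480 camera: focal length 800 px, principal point at the centre.
    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    R = np.eye(3)                          # camera aligned with the world axes
    t = np.zeros(3)
    virtual_anchor = np.array([0.1, 0.0, 2.0])     # virtual object 2 m in front of the camera
    print(project_point(virtual_anchor, K, R, t))  # -> [360. 240.]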

Keywords

Augmented Reality, Inertial Measurement Unit, Virtual Object, Augmented Reality Application, Scene Model

Acknowledgments

This work has been supported by the EC Grant IST-2002-002013 MATRIS. This work does not represent the opinion of the European Community, and the European Community is not responsible for any use which may be made of its contents.

Copyright information

© Springer-Verlag 2007

Authors and Affiliations

  1. Computer Vision Laboratory, Linköping University, Linköping, Sweden
  2. Institute of Computer Science, Christian-Albrechts-Universität Kiel, Kiel, Germany
