
Signal, Image and Video Processing, Volume 9, Issue 4, pp 831–849

Hybrid tracking system for robust fiducials registration in augmented reality

  • Madjid Maidi
  • Fakhreddine Ababsa
  • Malik Mallem
  • Marius Preda
Original Paper

Abstract

An effective augmented reality system requires accurate registration of virtual graphics on real images. In this work, we developed a multi-modal tracking architecture for object identification and occlusion handling. Our approach combines several sensors and techniques to cope with changes in the environment. The architecture comprises a first module for coded-target registration based on a hybrid pose-estimation algorithm. To manage partial target occlusions, a second module based on a robust feature-point tracking method is developed. The last component of the system is the hybrid tracking module: this multi-sensor part handles total target occlusions. Experiments with the multi-modal system demonstrated the effectiveness of the proposed tracking approach and its occlusion handling in augmented reality applications.
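The hand-over between the three modules described above can be pictured as a simple per-frame mode selection: use the coded target while it is fully visible, fall back to feature-point tracking under partial occlusion, and fall back to inertial prediction under total occlusion. The sketch below is a minimal illustration of that control logic only, not the authors' implementation; the names (`Mode`, `select_mode`, `min_points`) and the four-point threshold (the minimum number of 2D–3D correspondences commonly required for a pose estimate) are assumptions for illustration.

```python
from enum import Enum, auto

class Mode(Enum):
    MARKER = auto()     # coded-target registration (marker fully visible)
    FEATURES = auto()   # feature-point tracking (marker partially occluded)
    INERTIAL = auto()   # multi-sensor / inertial prediction (total occlusion)

def select_mode(marker_visible: bool, n_tracked_points: int,
                min_points: int = 4) -> Mode:
    """Pick the tracking modality for the current frame.

    A camera-pose estimate typically needs at least `min_points`
    2D-3D correspondences, hence the threshold on tracked points.
    """
    if marker_visible:
        return Mode.MARKER
    if n_tracked_points >= min_points:
        return Mode.FEATURES
    return Mode.INERTIAL
```

In such a cascade the inertial branch only bridges short occlusions, since pose drift accumulates until a visual module reacquires the target.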

Keywords

Augmented reality · Computer vision · Real-time tracking · Hybrid tracking · Multi-sensor systems


Copyright information

© Springer-Verlag London 2013

Authors and Affiliations

  • Madjid Maidi — 1
  • Fakhreddine Ababsa — 2
  • Malik Mallem — 2
  • Marius Preda — 1
  1. Département ARTEMIS, Télécom SudParis / Institut Mines-Télécom, Évry, France
  2. Laboratoire IBISC, Université d’Évry Val d’Essonne, Évry, France
