
Hybrid tracking system for robust fiducials registration in augmented reality

  • Original Paper
  • Published in Signal, Image and Video Processing

Abstract

An effective augmented reality system requires accurate registration of virtual graphics onto real images. In this work, we developed a multi-modal tracking architecture for object identification and occlusion handling. Our approach combines several sensors and techniques to cope with changes in the environment. The architecture comprises a first module that registers coded targets using a hybrid pose-estimation algorithm. To manage partial target occlusions, a second module based on robust feature-point tracking is developed. The last component of the system is the hybrid tracking module: this multi-sensor part handles total target occlusions. Experiments with the multi-modal system demonstrate the effectiveness of the proposed tracking approach and its occlusion handling in augmented reality applications.
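
As a concrete illustration of what the coded-target registration module computes, the sketch below recovers the camera pose from the four corners of a square target using OpenCV's generic PnP solver. This is a minimal stand-in, not the paper's hybrid pose-estimation algorithm; the target size, camera intrinsics and ground-truth pose are assumed values used only to synthesize the 2D corner detections.

import numpy as np
import cv2

# 3D corners of an assumed 10 cm square target, expressed in the target's own
# plane (z = 0), ordered counter-clockwise.
half = 0.05
object_points = np.array([[-half, -half, 0.0],
                          [ half, -half, 0.0],
                          [ half,  half, 0.0],
                          [-half,  half, 0.0]])

# Assumed pinhole intrinsics (focal lengths and principal point in pixels)
# and zero lens distortion; in practice these come from camera calibration.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)

# Synthesize "detected" corners by projecting the target from a known pose.
rvec_true = np.array([0.10, -0.20, 0.05])   # rotation (axis-angle, radians)
tvec_true = np.array([0.02, -0.01, 0.60])   # translation (metres)
image_points, _ = cv2.projectPoints(object_points, rvec_true, tvec_true, K, dist)

# Registration step: recover the pose from the 2D-3D corner correspondences.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
print("pose recovered:", ok, rvec.ravel(), tvec.ravel())

In the architecture described above, this visual estimate would be replaced by feature-point tracking when only part of the target is visible, and by inertial prediction when the target is totally occluded.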

Author information

Corresponding author

Correspondence to Madjid Maidi.

About this article

Cite this article

Maidi, M., Ababsa, F., Mallem, M. et al. Hybrid tracking system for robust fiducials registration in augmented reality. SIViP 9, 831–849 (2015). https://doi.org/10.1007/s11760-013-0508-4
