The Visual Computer, Volume 30, Issue 9, pp 1035–1044

Mono-spectrum marker: an AR marker robust to image blur and defocus

  • Masahiro Toyoura
  • Haruhito Aruga
  • Matthew Turk
  • Xiaoyang Mao
Original Article

Abstract

Planar markers enable an augmented reality (AR) system to estimate the pose of objects from images containing them. However, conventional markers are difficult to detect in blurred or defocused images. We propose a new marker, together with a detection and identification method, designed to work under such conditions. The problem with conventional markers is that their patterns consist of high-frequency components, such as sharp edges, which are attenuated in blurred or defocused images. Our marker instead consists of a single low-frequency component; we call it a mono-spectrum marker. The mono-spectrum marker can be detected in real time with a GPU. In experiments, we confirm that the mono-spectrum marker can be accurately detected in blurred and defocused images in real time. Using these markers can increase the performance and robustness of AR systems and other vision applications that require detection or tracking of defined markers.
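The abstract's core claim — that blur suppresses high-frequency marker patterns while sparing a low-frequency one — can be illustrated with a small numeric sketch. This is not the paper's detection algorithm; it only models blur as a Gaussian low-pass filter, whose frequency response exp(-2·pi²·sigma²·f²) follows from the Fourier transform of a Gaussian. The function name and the two sample frequencies are illustrative choices, not values from the paper.

```python
import numpy as np

def blur_attenuation(freq_cycles_per_px, sigma_px):
    """Amplitude gain of a sinusoidal pattern at the given spatial
    frequency after Gaussian blur with standard deviation sigma_px.
    A Gaussian blur multiplies each frequency f by
    exp(-2 * pi^2 * sigma^2 * f^2)."""
    return np.exp(-2.0 * (np.pi ** 2) * (sigma_px ** 2)
                  * (freq_cycles_per_px ** 2))

sigma = 5.0  # fairly strong defocus, in pixels

# A single low-frequency component (mono-spectrum-style pattern)
low = blur_attenuation(0.02, sigma)

# Edge-like high-frequency content typical of conventional markers
high = blur_attenuation(0.25, sigma)

print(low, high)  # the low frequency survives; the high one vanishes
```

Under this model the low-frequency component retains over 80 % of its amplitude, while the edge-like frequency is attenuated to numerically zero — which is why sharp-edged markers fail under blur while a single low-frequency pattern remains detectable.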

Keywords

Augmented reality · Spectrum analysis · Planar marker


Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Masahiro Toyoura (1)
  • Haruhito Aruga (1)
  • Matthew Turk (2)
  • Xiaoyang Mao (1)

  1. University of Yamanashi, Kofu, Japan
  2. University of California, Santa Barbara, USA
