
Multimedia Tools and Applications, Volume 75, Issue 16, pp 9563–9585

Exploring legibility of augmented reality X-ray

  • Marc Ericson C. Santos
  • Igor de Souza Almeida
  • Goshiro Yamamoto
  • Takafumi Taketomi
  • Christian Sandor
  • Hirokazu Kato

Abstract

Virtual objects can be visualized inside real objects using augmented reality (AR). This visualization is called AR X-ray because it gives the impression of seeing through the real object. In standard AR, virtual information is overlaid on top of the real world. To make a virtual object appear inside a real object, AR X-ray must partially occlude the virtual object with visually important regions of the real object. In effect, the virtual object becomes less legible than when it is completely unoccluded. Legibility is an important consideration for various applications of AR X-ray. In this research, we explored legibility in two implementations of AR X-ray, namely edge-based and saliency-based. In our first experiment, we explored the amount of occlusion users can tolerate while still comfortably distinguishing small virtual objects. In our second experiment, we compared edge-based and saliency-based AR X-ray when visualizing virtual objects inside various real objects, and we benchmarked the legibility of both methods against alpha blending. From our experiments, we observed that users have varied preferences for the amount of occlusion cues in both methods. The partial occlusions generated by the edge-based and saliency-based methods need to be adjusted depending on the lighting conditions and the texture complexity of the occluding object. In most cases, users identify objects faster with saliency-based AR X-ray than with edge-based AR X-ray. Insights from this research can be directly applied to the development of AR X-ray applications.
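The partial-occlusion idea behind AR X-ray can be illustrated with a short sketch. The code below is a minimal, hypothetical illustration rather than the authors' implementation: it alpha-blends the hidden virtual layer into the camera view and then restores the real surface's Canny edges on top as occlusion cues, which captures the spirit of the edge-based variant; a saliency-based variant would replace the edge mask with a thresholded saliency map. The names `camera_frame` and `hidden_layer` and all threshold values are assumptions for illustration.

```python
# Minimal sketch of edge-based AR X-ray compositing (illustrative only,
# not the authors' implementation). `camera_frame` is the live view of
# the real occluder and `hidden_layer` is the rendered view of the
# virtual object placed behind/inside it, both HxWx3 uint8 images.
import cv2
import numpy as np

def edge_based_xray(camera_frame, hidden_layer, base_alpha=0.6,
                    canny_lo=50, canny_hi=150):
    """Blend the hidden virtual layer into the camera frame, then
    re-draw the occluder's edges on top as partial-occlusion cues."""
    # Plain alpha blend, which also serves as the benchmark condition.
    blended = cv2.addWeighted(camera_frame, 1.0 - base_alpha,
                              hidden_layer, base_alpha, 0.0)

    # Edge map of the real surface; these act as the "visually
    # important" regions that should remain in front of the virtual object.
    gray = cv2.cvtColor(camera_frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, canny_lo, canny_hi)
    edge_mask = cv2.dilate(edges, np.ones((3, 3), np.uint8)) > 0

    # Where edges exist, keep the original camera pixels (occlusion cue);
    # elsewhere, show the blend that reveals the hidden virtual object.
    result = blended.copy()
    result[edge_mask] = camera_frame[edge_mask]
    return result
```

In this sketch, `base_alpha` and the Canny thresholds play the role of the adjustable occlusion-cue parameters discussed in the abstract: stronger edges or a lower alpha preserve more of the real surface but reduce the legibility of the hidden object.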

Keywords

Augmented reality · Augmented reality X-ray · Empirical study · Legibility · Visualization

Notes

Acknowledgments

This work was supported by the Grant-in-Aid for JSPS Fellows, Grant Number 15J10186.


Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  • Marc Ericson C. Santos (1)
  • Igor de Souza Almeida (1)
  • Goshiro Yamamoto (1)
  • Takafumi Taketomi (1)
  • Christian Sandor (1)
  • Hirokazu Kato (1)

  1. Interactive Media Design Laboratory, Graduate School of Information Science, Nara Institute of Science and Technology, Nara, Japan
