Evaluation of X-ray visualization techniques for vertical depth judgments in underground exploration

Original Article

Abstract

This paper investigates the depth-judgment performance of X-ray visualization techniques for rendering fully occluded geometries in augmented reality. The techniques selected for this evaluation are careless overlay (CO), edge overlay (EO), excavation box (EB) and a cross-sectional visualization technique (CS). We designed and conducted a comprehensive user study with 16 participants to examine and analyze the effects of the visualization technique, the presence of additional virtual objects and the scale of the vertical depths. To the best of our knowledge, this is the first user study in which these techniques are compared against each other on judged vertical depth distances. We report our findings using four dependent variables: accuracy, signed error, absolute error and response time, both to shed light on real-world performance and to reveal the estimation tendencies of each technique. Our findings suggest that EB and CS perform similarly to each other and better than CO and EO. We also observed significantly better results for the EB and CS techniques when judging Top and Bottom distances than when judging Middle distances. Building on these findings, we propose a new visualization technique for underground investigation with multiple views. The multi-view technique is our own implementation, inspired by magic lens and cross-sectional visualizations, with correlated displays.
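The four dependent variables named above can be made concrete with a small sketch. The following is an illustrative example (not taken from the paper) of how signed and absolute error are conventionally derived from a participant's judged depth and the true depth; the function and variable names are assumptions for illustration only.

```python
def depth_errors(judged_cm: float, true_cm: float) -> tuple[float, float]:
    """Return (signed_error, absolute_error) for one depth judgment.

    A negative signed error indicates the depth was underestimated;
    a positive signed error indicates overestimation. The absolute
    error discards direction and reflects magnitude only.
    """
    signed = judged_cm - true_cm
    return signed, abs(signed)

# Example: a participant judges a true 50 cm vertical depth as 42 cm.
signed, absolute = depth_errors(42.0, 50.0)
```

Here `signed` is -8.0 (an underestimation) and `absolute` is 8.0; averaging signed errors over trials reveals an estimation tendency, while averaging absolute errors summarizes overall accuracy.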

Keywords

Augmented reality · Depth perception · User study · X-ray visualization


Copyright information

© Springer-Verlag Berlin Heidelberg 2017

Authors and Affiliations

Sabanci University, Istanbul, Turkey
