
Superpowers in the Metaverse: Augmented Reality Enabled X-Ray Vision in Immersive Environments

Chapter in: Augmented and Virtual Reality in the Metaverse

Part of the book series: Springer Series on Cultural Computing (SSCC)

Abstract

This chapter explores the use of augmented reality (AR) enabled X-ray vision (XRV) in immersive environments. AR XRV is the ability to render a virtual object as if it were behind, or encapsulated within, a real-world object; for example, a user may look at a wall and, through augmented reality, see what lies inside or behind it. Seamlessly merging virtual objects with the real world is challenging because virtual objects in augmented reality are typically rendered in front of real-world surfaces, causing a depth mismatch. This mismatch does not accurately portray the virtual object's position in the real world and may lead to perception problems when augmenting the real world with virtual information. This review provides an overview of the existing techniques, applications, and devices that provide XRV in immersive environments and summarizes the current research challenges. The emergent nature of XRV in immersive environments is highlighted, emphasizing the need to understand the challenges and opportunities of AR-enabled XRV. The overview presented in this chapter will assist researchers in identifying challenges as the technology necessary for XRV within the metaverse matures into a refined capability and an accepted convention.
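
To make the depth mismatch concrete, the sketch below contrasts the three compositing behaviours implied by the abstract. It is a minimal illustration, not code from the chapter: it assumes the AR device supplies a per-pixel depth map of the real scene alongside the camera frame, and that the virtual object is rendered with its own depth buffer; all function and parameter names (composite, ghost_alpha, and so on) are hypothetical. Naive overlay reproduces the depth mismatch described above, depth-tested compositing lets real geometry occlude the virtual object, and a simple ghosting pass blends the hidden portion back in at reduced opacity, which is the basic idea behind X-ray vision rendering.

```python
import numpy as np

def composite(real_rgb, real_depth, virt_rgb, virt_depth, virt_mask,
              mode="naive", ghost_alpha=0.35):
    """Composite a rendered virtual object into a camera frame.

    real_rgb   : (H, W, 3) camera image of the real scene
    real_depth : (H, W)    per-pixel depth of the real scene, in metres
    virt_rgb   : (H, W, 3) rendered virtual object
    virt_depth : (H, W)    per-pixel depth of the virtual object
    virt_mask  : (H, W)    True where the virtual object was rendered
    """
    out = real_rgb.astype(np.float32).copy()

    if mode == "naive":
        # Typical AR overlay: the virtual object is always drawn on top,
        # even where it is physically behind a real surface. This is the
        # depth mismatch described in the abstract.
        out[virt_mask] = virt_rgb[virt_mask]
    elif mode == "occluded":
        # Depth-tested compositing: draw the virtual object only where it
        # is nearer than the real surface, so real geometry occludes it.
        visible = virt_mask & (virt_depth < real_depth)
        out[visible] = virt_rgb[visible]
    elif mode == "xray_ghost":
        # Simple X-ray ghosting: where a real surface hides the virtual
        # object, blend it in at reduced opacity so it reads as "behind"
        # rather than "in front of" the occluder.
        visible = virt_mask & (virt_depth < real_depth)
        hidden = virt_mask & ~visible
        out[visible] = virt_rgb[visible]
        out[hidden] = (ghost_alpha * virt_rgb[hidden]
                       + (1.0 - ghost_alpha) * out[hidden])
    return out.astype(real_rgb.dtype)
```

Published XRV systems typically go further than a constant ghost_alpha, for example deriving the blend mask from edges or visual saliency so that enough of the occluding surface remains visible to preserve the sense of depth ordering.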




Acknowledgements

The authors would like to acknowledge the Australian Government Research Training Program Scholarship for funding this research, Bradley Richards for illustrating Fig. 15.1, and Ha Chau Nguyen for helping to edit this chapter.

Author information

Correspondence to Thomas J. Clarke.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Clarke, T.J., Gwilt, I., Zucco, J., Mayer, W., Smith, R.T. (2024). Superpowers in the Metaverse: Augmented Reality Enabled X-Ray Vision in Immersive Environments. In: Geroimenko, V. (eds) Augmented and Virtual Reality in the Metaverse. Springer Series on Cultural Computing. Springer, Cham. https://doi.org/10.1007/978-3-031-57746-8_15


  • DOI: https://doi.org/10.1007/978-3-031-57746-8_15


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-57745-1

  • Online ISBN: 978-3-031-57746-8

  • eBook Packages: Computer Science (R0)
