
ARSlice: Head-Mounted Display Augmented with Dynamic Tracking and Projection

  • Regular Paper
  • Journal of Computer Science and Technology

Abstract

Computed tomography (CT) produces cross-sectional images of the body, but visualizing these images effectively remains a challenging problem. Augmented and virtual reality technologies offer promising solutions; however, existing solutions suffer from either tethered displays or wireless transmission latency. In this paper, we present ARSlice, a proof-of-concept prototype that visualizes CT images in an untethered manner without wireless transmission latency. The ARSlice prototype consists of two parts: the user end and the projector end. Through dynamic tracking and projection, the projector end tracks the user-end equipment and projects CT images onto it in real time, while the user-end equipment displays these CT images in 3D space. The key feature of the user-end equipment is that it is a purely optical device: lightweight, low-cost, and consuming no energy. Our experiments demonstrate that the ARSlice prototype provides partial six-degrees-of-freedom interaction for the user at a high frame rate. By interactively visualizing CT images in 3D space, ARSlice can help untrained users better understand that CT images are slices of a body.
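To make the dynamic tracking-and-projection idea concrete, the following is a minimal, hypothetical sketch of one projector-end rendering step: given the four corners of the tracked user-end surface in projector coordinates (produced by whatever tracker is used), the current CT slice is warped onto that surface before being sent to the projector. The corner-based interface, the OpenCV homography step, and the tracker/projector calls are illustrative assumptions, not the paper's actual pipeline.

    # Hypothetical projector-end rendering step (not the authors' implementation):
    # warp the current CT slice onto the tracked user-end surface.
    import cv2
    import numpy as np

    def render_projector_frame(ct_slice, surface_corners, projector_size):
        """Warp a CT slice onto the tracked surface.

        ct_slice        : HxW (or HxWx3) numpy array holding the current slice.
        surface_corners : 4x2 array of the surface corners in projector pixels,
                          ordered top-left, top-right, bottom-right, bottom-left.
        projector_size  : (width, height) of the projector image.
        """
        # Corners of the CT slice image (source quadrilateral).
        h, w = ct_slice.shape[:2]
        src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
        # Corners of the tracked surface (destination quadrilateral), same order.
        dst = np.float32(surface_corners)

        # Homography mapping slice pixels onto the tracked surface.
        H = cv2.getPerspectiveTransform(src, dst)
        return cv2.warpPerspective(ct_slice, H, projector_size)

    # Per-frame loop; tracker and projector interfaces are placeholders:
    #   corners = tracker.locate_user_end()              # hypothetical tracker call
    #   frame = render_projector_frame(ct_slices[i], corners, (1920, 1080))
    #   projector.show(frame)                            # hypothetical projector call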



Author information

Corresponding author

Correspondence to Li-Hui Wang.

Supplementary Information

ESM 1 (PDF 143 kb)

About this article

Cite this article

Wang, YP., Xie, SW., Wang, LH. et al. ARSlice: Head-Mounted Display Augmented with Dynamic Tracking and Projection. J. Comput. Sci. Technol. 37, 666–679 (2022). https://doi.org/10.1007/s11390-022-2173-y
