Holographic Visualisation and Interaction of Fused CT, PET and MRI Volumetric Medical Imaging Data Using Dedicated Remote GPGPU Ray Casting

  • Magali Fröhlich
  • Christophe Bolinhas
  • Adrien Depeursinge
  • Antoine Widmer
  • Nicolas Chevrey
  • Patric Hagmann
  • Christian Simon
  • Vivianne B. C. Kokje
  • Stéphane Gobron
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11042)

Abstract

Medical experts commonly use imaging, including Computed Tomography (CT), Positron-Emission Tomography (PET) and Magnetic Resonance Imaging (MRI), for diagnosis or to plan a surgery. These scans give a highly detailed representation of the patient anatomy, but the usual separate Three-Dimensional (3D) visualisations on screens do not provide a convenient and efficient understanding of the real anatomical complexity. This paper presents a computer architecture allowing medical staff to visualise and interact in real time with holographic fused CT, PET and MRI scans of patients. A dedicated workstation with a wireless connection enables real-time General-Purpose Processing on Graphics Processing Units (GPGPU) ray-casting computation for the mixed reality (MR) headset. The hologram can be manipulated with hand gestures and voice commands, providing instantaneous visualisation and manipulation of 3D scans at a frame rate of 30 fps with a delay below 120 ms. This performance gives a seamless interactive experience for the user [10].
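The core rendering technique named in the abstract is volume ray casting: rays are marched through the scan volume, and at each sample a transfer function maps intensity to colour and opacity, which are composited front to back. The following is a minimal single-ray sketch of that idea; the function name, the toy linear transfer function, and the synthetic cube volume are illustrative assumptions, not the paper's implementation (which runs on the GPU via GPGPU kernels).

```python
import numpy as np

def ray_cast(volume, origin, direction, n_steps=128, step=1.0):
    """Front-to-back compositing of a scalar volume along one ray.

    volume    : 3D array of normalised intensities in [0, 1]
    origin    : ray start point in voxel coordinates
    direction : unit ray direction
    Returns the accumulated intensity and opacity for the ray.
    """
    colour, alpha = 0.0, 0.0
    pos = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    for _ in range(n_steps):
        ijk = np.round(pos).astype(int)        # nearest-neighbour sampling
        if np.any(ijk < 0) or np.any(ijk >= volume.shape):
            break                               # ray left the volume
        s = volume[tuple(ijk)]
        a = s * 0.1                             # toy transfer function: opacity grows with intensity
        colour += (1.0 - alpha) * a * s         # front-to-back compositing
        alpha += (1.0 - alpha) * a
        if alpha > 0.99:                        # early ray termination
            break
        pos += d * step
    return colour, alpha

# Example: a bright cube inside an otherwise empty volume.
vol = np.zeros((32, 32, 32))
vol[8:24, 8:24, 8:24] = 1.0
c, a = ray_cast(vol, origin=(16, 16, 0), direction=(0, 0, 1))
```

In a real renderer this loop runs in parallel, one GPU thread per screen pixel, with trilinear sampling and a clinically tuned transfer function per modality; early ray termination is one of the standard optimisations that makes the 30 fps budget reachable.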

Keywords

Augmented and mixed reality · Medical application · Medical visualisation · MRI scan · PET scan · CT scan · GPGPU ray casting · HoloLens · Hologram

References

  1. Bach, B., Sicat, R., Beyer, J., Cordeil, M., Pfister, H.: The hologram in my hand: how effective is interactive exploration of 3D visualizations in immersive tangible augmented reality? IEEE TVCG 24(1), 457–467 (2018)
  2. Bernhardt, S., Nicolau, S.A., Soler, L., Doignon, C.: The status of augmented reality in laparoscopic surgery as of 2016. Med. Image Anal. 37, 66–90 (2017)
  3. Douglas, D.B., Wilke, C.A., Gibson, J.D., Boone, J.M., Wintermark, M.: Augmented reality: advances in diagnostic imaging. Multimodal Tech. Interact. 1, 29 (2017)
  4. Egger, J., et al.: HTC Vive MeVisLab integration via OpenVR for medical applications. PLOS ONE 12(3), 1–14 (2017)
  5. Gobron, S., Çöltekin, A., Bonafos, H., Thalmann, D.: GPGPU computation and visualization of 3D cellular automata. Visual Comput. 27(1), 67–81 (2011)
  6. Hamacher, A., et al.: Application of virtual, augmented, and mixed reality to urology. Int. Neurourol. J. 20(3), 172–181 (2016)
  7. Karmonik, C., Boone, T.B., Khavari, R.: Workflow for visualization of neuroimaging data with an AR device. J. Digital Imaging 31(1), 26–31 (2017)
  8. Morley, C., Choudhry, O., Kelly, S., Phillips, J., Ahmed, F.: In: SIIM Scientific Session: Posters & Demonstrations (2017)
  9. Qian, L., et al.: Technical note: towards virtual monitors for image guided interventions - real-time streaming to optical see-through head-mounted displays (2017)
  10. Raaen, K., Kjellmo, I.: Measuring latency in VR systems, pp. 457–462 (2015)
  11. Syed, A.Z., Zakaria, A., Lozanoff, S.: Dark room to augmented reality: application of HoloLens technology for oral radiological diagnosis. Oral Surg. Oral Med. Oral Pathol. Oral Radiol. 124(1), e33 (2017)
  12. Tepper, O.M., et al.: Mixed reality with HoloLens. Plast. Reconstr. Surg. 140(5), 1066–1070 (2017)
  13. Vaughan, N., Dubey, V.N., Wainwright, T.W., Middleton, R.G.: A review of virtual reality based training simulators for orthopaedic surgery. Med. Eng. Phys. 38(2), 59–71 (2016)
  14. Wang, J., et al.: Real-time computer-generated integral imaging and 3D image calibration for augmented reality surgical navigation. Comput. Med. Imaging Graph. 40, 147–159 (2015)

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. HE-ARC School of Engineering, University of Applied Sciences and Arts Western Switzerland (HES-SO), Neuchâtel, Switzerland
  2. HE-ARC School of Health, University of Applied Sciences and Arts Western Switzerland (HES-SO), Neuchâtel, Switzerland
  3. School of Management, University of Applied Sciences and Arts Western Switzerland (HES-SO), Sierre, Switzerland
  4. Biomedical Imaging Group (BIG), Ecole polytechnique fédérale de Lausanne (EPFL), Lausanne, Switzerland
  5. Department of Radiology, Lausanne University Hospital (CHUV-UNIL), Lausanne, Switzerland
  6. Department of Otolaryngology - Head and Neck Surgery, CHUV, Lausanne, Switzerland