Abstract
Introduction
CT-guided interventions are taught using a mentored approach on real patients. It is well established that simulation is a valuable training tool in medicine. This project assessed the feasibility and acceptance of replicating a CT-guided intervention using a bespoke software application with an augmented reality head-mounted display (ARHMD).
Methods
A virtual patient was generated using a CT dataset obtained from The Cancer Imaging Archive. A surface mesh of a virtual patient was projected into the field-of-view of the operator. ChArUco markers, placed on both the needle and agar jelly phantom, were tracked using RGB cameras built into the ARHMD. A virtual CT slice simulating the needle position was generated on voice command. The application was trialled by senior interventional radiologists and trainee radiologists with a structured questionnaire evaluating face validity and technical aspects.
Results
Sixteen users trialled the application and feedback was received from all. Eleven felt that the accuracy and realism were adequate for training, and twelve felt more confident about their CT biopsy skills after the training session.
Discussion
The study showed the feasibility of simulating a CT-guided procedure with augmented reality and that this could be used as a training tool.
Key Points
• Simulating a CT-guided procedure using augmented reality is possible.
• The simulator developed could be an effective training tool for clinical practical skills.
• Complexity of cases can be tailored to address the training level demands.
Introduction
Historically, procedural training in medicine has relied on Halsted’s model of “see one, do one, teach one” [1, 2]. This apprenticeship model of learning is based on observation, then performance, and finally demonstration. The model has some drawbacks: firstly, one cannot reliably and objectively monitor or predict the output of a training programme, since feedback depends on the subjective judgement of a trainer. Moreover, it requires that an apprentice learns a procedure by practising on real patients in a clinical setting, which may cause discomfort and pose risks to patients, particularly if the procedure involves ionising radiation. It also requires the opportunity to repeat the procedure in a clinical environment until competence is achieved, which makes training accessible only to a select group of students.
Procedural training by simulation can be a risk-free and low-pressure alternative to the apprenticeship method [3]. In simulation-based training, students can practise a procedure multiple times and learn from mistakes without risk to patients. The effectiveness of simulation as a training tool for procedural skills has been well researched over the past few decades [4]. Even low-fidelity simulations of CT procedures have been shown to reduce procedure time and radiation exposure as well as to improve participants’ confidence [5, 6]. More recently, advances in technology have driven increased interest in the application of virtual reality (VR) and augmented reality (AR) to procedural training. A key advantage of VR and AR simulation for procedural training is that these technologies inherently require active learner engagement, which is widely recognised as a cornerstone of effective learning [7]. Recent evidence supporting the use of VR and AR simulations in medical education and training is abundant [8,9,10,11,12,13,14,15,16,17,18].
VR and AR simulation for procedural training has also been explored in the domain of radiology. Some examples of this are the use of an AR simulator for ultrasound-guided percutaneous renal access, which showed significant performance improvement in novices [16], and the use of an AR simulator for training in fluoroscopy-guided lumbar puncture, which was regarded as a “realistic replication of the anatomy and procedure” by trial users [13]. These results are promising, but it remains to be shown specifically that CT-guided procedures can be realistically simulated in a scalable and low-cost way.
The Microsoft HoloLens is an augmented reality head-mounted display (ARHMD) with three-dimensional (3D) “mixed reality” capabilities, by which we mean a head-mounted display with an integrated processing unit that allows real-time 3D mapping and tracking of the physical space around the user and that can overlay 3D objects into the field-of-view of the user as part of the physical space. The HoloLens was one of the first ARHMDs with 3D mixed reality capabilities. The second version of the headset, the HoloLens 2, was introduced in 2019. Notably, the HoloLens 2 has a wider field of view and a shifted centre of gravity allowing for improved ergonomics [19]. The device’s visible light camera also has a resolution significantly greater than its predecessor, which can enable more accurate pattern recognition and tracking. We expect that the current capabilities of 3D mixed reality ARHMDs, such as those of the HoloLens 2, together with the haptic feedback from a biopsy phantom, can be used for a realistic simulation of CT-guided procedures, potentially increasing the effectiveness and quality of medical training with a relatively low cost. There are examples of studies examining the feasibility of 3D mixed reality ARHMDs for surgical procedures [20] and for ultrasound-guided interventions [21], but not for the simulation of CT-guided procedures.
To address this, we present a simulator for CT-guided biopsies with haptic feedback using the HoloLens 2 and a bespoke software application. We present the results of a user trial of the simulator, evaluating its accuracy and realism, its acceptance by trainers and trainees, and its feasibility for use in training for CT-guided biopsies.
Materials and methods
Phantom
Multiple identical “mock” phantoms of a torso were created from an agar jelly mixture [22]. In each phantom, a reference ChArUco marker was placed in a fixed position for position verification by the ARHMD. A CorVocet© Biopsy Needle was fitted with a ChArUco marker; the needle could be freely manipulated and its movement tracked by the ARHMD. ChArUco markers combine the features of a chessboard pattern, whose intersection points can be precisely refined, with those of the ArUco marker family, which allows fast detection that continues even when the marker is partially occluded.
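Once a ChArUco detector returns the needle marker's pose (a rotation matrix and a translation vector), locating the needle tip reduces to a rigid transform of a fixed tip offset measured in the marker's own frame. The following is an illustrative sketch, not the authors' code; the function names and the 15 cm tip offset are assumptions for the example.

```python
# Hypothetical sketch: transform the needle tip, known as a fixed offset in
# the marker frame, into the camera frame using the detected marker pose
# (rotation matrix R, translation t). All values below are illustrative.

def mat_vec(R, v):
    """Multiply a 3x3 rotation matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def marker_to_camera(R, t, p_marker):
    """Rigid transform: p_camera = R @ p_marker + t."""
    p = mat_vec(R, p_marker)
    return [p[i] + t[i] for i in range(3)]

# Identity rotation, marker 10 cm in front of the camera, and a needle tip
# 15 cm from the marker along the marker's x-axis (measured on the needle).
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 0.10]
tip_offset = [0.15, 0.0, 0.0]  # metres

tip_camera = marker_to_camera(R, t, tip_offset)
print(tip_camera)  # [0.15, 0.0, 0.1]
```

In practice the detected pose would come from the ARHMD's RGB camera frames via a ChArUco pose-estimation routine; the transform above is the step that turns that pose into a needle-tip position the simulator can compare against the virtual anatomy.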
Hardware
The Microsoft HoloLens 2™ is an augmented reality device developed by the Microsoft Corporation. Multiple HoloLens 2 devices were available for use.
Software
A publicly available CT dataset of a torso was obtained from The Cancer Imaging Archive [19]. A HoloLens application was developed that displays multiple interactive elements in the field-of-view of the user: an interface containing a rendering of the slices of the CT dataset together with a simulated CT image of the needle, a 3D model of a torso, and a 3D model of a needle. The software was written entirely in C# and C++ using Microsoft Visual Studio 2019, the DirectX SDK, and the ChArUco implementation in OpenCV. The location of the 3D torso model corresponds to the reference marker placed on the phantom, and the location of the 3D needle model corresponds to the marker placed on the biopsy needle. A green line is displayed on the 3D torso model to indicate the scan slice location, mimicking the positional laser guide used in modern CT scanners.
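Displaying the correct slice for the green line implies a mapping from the tracked position along the phantom's long axis to a slice index in the CT dataset. A minimal sketch of that mapping, with assumed slice spacing and origin (the real application's values are not stated in the paper):

```python
# Illustrative sketch (not the authors' code): map a needle-tip coordinate
# along the scan axis to the nearest CT slice index, clamped to the dataset.

def slice_index(z_tip_mm, z_first_slice_mm, slice_spacing_mm, n_slices):
    """Return the index of the CT slice nearest to the needle tip."""
    idx = round((z_tip_mm - z_first_slice_mm) / slice_spacing_mm)
    return max(0, min(n_slices - 1, idx))

# Assumed example: a 200-slice dataset starting at z = -500 mm, 3 mm spacing.
print(slice_index(-350.0, -500.0, 3.0, 200))  # 50
```

The clamp keeps the displayed slice inside the dataset even if the tracked marker briefly strays beyond the phantom, which matters for a hand-held needle tracked by camera.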
The HoloLens application can be controlled by the following voice commands:
1. “Select” selects the needle tool
2. “Scan” brings up the CT scan at the location of the virtual needle
3. “Next” brings up the next CT slice
4. “Previous” brings up the previous CT slice
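The four commands above can be sketched as a simple dispatch table; this is a hedged illustration of the control flow only, with hypothetical handler names (the actual HoloLens application uses the platform's speech-recognition API rather than plain strings):

```python
# Minimal sketch of a voice-command dispatcher matching the four commands.
# Handler names and state fields are assumptions for illustration.

class Simulator:
    def __init__(self, n_slices):
        self.n_slices = n_slices
        self.current_slice = 0
        self.needle_selected = False

    def on_command(self, word):
        handlers = {
            "select": self.select_needle,
            "scan": self.scan,
            "next": self.next_slice,
            "previous": self.previous_slice,
        }
        handlers[word.lower()]()

    def select_needle(self):
        self.needle_selected = True

    def scan(self):
        # In the real application this renders a CT slice with a simulated
        # needle artefact at the tracked needle position.
        pass

    def next_slice(self):
        self.current_slice = min(self.n_slices - 1, self.current_slice + 1)

    def previous_slice(self):
        self.current_slice = max(0, self.current_slice - 1)

sim = Simulator(n_slices=200)
sim.on_command("Select")
sim.on_command("Next")
print(sim.needle_selected, sim.current_slice)  # True 1
```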
Image segmentation was used to prepare the CT dataset for the biopsy procedure training. Objects in the rendering of the CT dataset were coloured to indicate the objectives for the participants to target with the biopsy needle. Two retroperitoneal lymph nodes were identified as targets representing two levels of difficulty. The beginner target was a simple pararenal retroperitoneal lymph node with a direct path and the expert target was a para-aortic lymph node.
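One simple way to represent a segmented lymph-node target for hit-testing and colouring is a labelled voxel mask. The sketch below marks a spherical region around a manually chosen centroid; the paper does not describe the segmentation method, so the shape, coordinates, and radius here are assumptions for illustration.

```python
# Hedged illustration: a spherical voxel mask around an assumed target
# centroid, usable both for colouring the rendering and for hit-testing.

def sphere_mask(shape, centre, radius):
    """Return the set of (x, y, z) voxel indices inside a sphere."""
    cx, cy, cz = centre
    r2 = radius * radius
    return {
        (x, y, z)
        for x in range(shape[0])
        for y in range(shape[1])
        for z in range(shape[2])
        if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r2
    }

target = sphere_mask((16, 16, 16), centre=(8, 8, 8), radius=2)
print((8, 8, 8) in target, (0, 0, 0) in target)  # True False
```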
Validation
To validate the effectiveness of our simulator for biopsy procedure training, we held a user trial during a day-long introduction to interventional radiology for junior radiology trainees. We enrolled 12 junior trainees and 4 trainers from this event on a voluntary basis. The junior trainees consisted of radiology registrars in training at local NHS medical centres. All participants signed an Imperial College consent form allowing the use of their data for this study.
An introductory presentation was given to all participants, in which the key functionalities of the HoloLens, the relevant voice commands, and relevant gestures for controlling the hardware were introduced.
Three stations, each consisting of a mock phantom and a HoloLens 2, were available to the participants (Fig. 1). The junior trainees were divided into 4 groups of 3. The participants in the first group were each assigned to a station to complete a session of 30 min, and the groups rotated afterwards. The first 10 min of each session was reserved for an introduction to the HoloLens and the simulator, which included practising voice commands under guidance by an expert. After this introduction, each participant was given 20 min to complete a simulated biopsy procedure, starting with the beginner target and on completion moving to the expert target (Fig. 2).
The procedure that the participants followed consisted of the following steps:
1. Placing the biopsy needle in the mock torso, while aligning the needle with the simulated green (laser) line
2. Taking a simulated scan, advancing the slice position if necessary
3. Viewing the simulated location of the needle in the CT image
4. Adjusting/advancing the needle towards the target
5. Repeating the simulated scan, advancing the slice position if necessary
6. Repeating until the objective is reached
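The scan-adjust loop above terminates when the needle tip is close enough to the target. A sketch of that success check, assuming millimetre coordinates; the tolerance and positions are illustrative, not values from the study:

```python
# Illustrative success check for the iterative procedure: the objective is
# reached when the tracked needle tip lies within a tolerance of the target.
import math

def target_reached(tip_mm, target_mm, tolerance_mm=5.0):
    """True when the needle tip is within tolerance of the target centre."""
    return math.dist(tip_mm, target_mm) <= tolerance_mm

print(target_reached((102.0, 48.0, -347.0), (100.0, 50.0, -350.0)))  # True
```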
Immediately after the simulation session, participants were asked to complete an anonymous feedback questionnaire evaluating their experience with the augmented reality simulator. The questionnaire consisted of closed-ended questions with Likert-type scale responses and open-ended questions. It was constructed from validated questionnaires used in previous studies, but was not itself validated in its current form.
Results
In total, 16 users trialled the application and all of them completed the questionnaire. An overview of the responses to the closed-ended questions is given in Tables 1 and 2 and Fig. 3. The results from the feedback questionnaire are presented in terms of accuracy and realism, acceptability, and feasibility for the delivery of biopsy procedure training.
Accuracy and realism
Accuracy was assessed in terms of needle placement, needle advancement, display of the simulated CT scan of the needle, and “feel” of the mock phantom. Of the users, 11 agreed that the simulation of the procedure was realistic and 11 agreed that the accuracy of the needle placement was adequate. The haptic feedback provided by the agar jelly was perceived as realistic by 11 users. One user stated that the simulation was “as realistic as it can get in a virtual training environment”. In particular, 3 out of 4 experts agreed that the simulation of the procedure and the needle positioning were accurate, and all experts agreed that the haptic feedback was realistic (Table 2).
Acceptability
More generally, the system should be perceived as an acceptable training tool. All users reported enjoying the training method (13 strongly agreed and 3 agreed). Four users agreed, 8 users neither agreed nor disagreed, and 3 users disagreed with the statement “the application was easy to use”. Fourteen users experienced no discomfort in wearing the HoloLens, and 2 users neither agreed nor disagreed. One user struggled slightly with the hardware and one user commented “I had a few difficulties with the headset in the beginning; figuring out exactly what to say and what it does”.
Feasibility for biopsy procedure training
Twelve of 16 users agreed that they felt more confident about their skills after training. Furthermore, 13 users agreed with the statement “This HoloLens app is a good addition to my current training”. One user commented that the simulator was beneficial for “understanding the geometrical relationship between needle position and slice number”. One user answered that the simulation provided a “rich new opportunity to practice.”
Discussion
We have shown that it is possible to simulate a CT-guided procedure using augmented reality technology. In the user trial of the simulator by trainees and trainers, the simulation of the procedure and of the haptic feedback was perceived as realistic by most users (11/16), and most users perceived the accuracy of the needle as adequate (11/16). However, one expert neither agreed nor disagreed that the simulation of the procedure was realistic.
The new functionalities of the HoloLens 2 and its improved ergonomics are likely to have improved scores related to acceptability, as all users reported enjoying the simulation training and no users reported discomfort in wearing the HoloLens. Half of the users neither agreed nor disagreed that the application was easy to use, while the remaining participants agreed and disagreed on the ease of use in similar proportions.
Most users (12/16) agreed that they felt more confident about their skills after using the simulator and most users (13/16) agreed that the simulator was a good addition to their training. Previous studies of simulations relating to CT-guided procedures also demonstrated an increase in user confidence [5, 6]. Both of these studies also demonstrated a reduced procedure completion time and radiation dose. A limitation of the current study is the fact that procedure time and radiation dose were not calculated.
In the introductory session of 10 min, participants had to learn to use the HoloLens and to interact with the virtual augmented content as well as learn the gestures and voice commands needed to complete the simulation procedure. While this was not measured, we speculate that the inexperience of the participants with both the HoloLens and the procedure itself combined with the short introduction time led to a lower perceived ease-of-use of the simulator, possibly due to an increased cognitive load during the simulation.
Compared to existing simulations, our simulator notably makes use of readily available materials and technologies with bespoke software, resulting in a low cost and high scalability.
Further work is needed to examine the applicability of this technology in training as well as in clinical practice. We suggest examining the effect of the simulator on the performance of biopsy procedures, in particular in comparison to “classical” biopsy procedure training on a real patient.
Abbreviations
3D: Three-dimensional
AR: Augmented reality
ARHMD: Augmented reality head-mounted display
ChArUco: Chessboard Augmented Reality library from the University of Cordoba
CT: Computed tomography
VR: Virtual reality
References
Kotsis SV, Chung KC (2013) Application of the “see one, do one, teach one” concept in surgical training. Plast Reconstr Surg 131:1194–1201
Pellegrini VD, Ferguson PC, Cruess R et al (2015) Sufficient competence to enter the unsupervised practice of orthopaedics: what is it, when does it occur, and do we know it when we see it? J Bone Joint Surg Am 97:1459–1464
Al-Elq AH (2010) Simulation-based medical teaching and learning. J Fam Community Med 17:35
Nestel D, Groom J, Eikeland-Husebø S, O’Donnell JM (2011) Simulation for learning and teaching procedural skills. Simul Healthc 6:S10–S13
Mendiratta-Lala M, Williams TR, Mendiratta V et al (2015) Simulation center training as a means to improve resident performance in percutaneous noncontinuous CT-guided fluoroscopic procedures with dose reduction. AJR Am J Roentgenol 204:W376–W383
Picard M, Nelson R, Roebel J et al (2016) Use of low-fidelity simulation laboratory training for teaching radiology residents CT-guided procedures. J Am Coll Radiol 13:1363–1368
Pottle J (2019) Virtual reality and the transformation of medical education. Future Healthc J 6:181–185
Weidenbach M, Wick C, Pieper S et al (2000) Augmented reality simulator for training in two-dimensional echocardiography. Comput Biomed Res 33:11–22
Kerner KF, Imielinska C, Rolland J, Tang H (2003) Augmented reality for teaching endotracheal intubation: MR imaging to create anatomically correct models. AMIA Annu Symp Proc 2003;2003:888
Luciano CJ, Banerjee PP, Bellotte B et al (2011) Learning retention of thoracic pedicle screw placement using a high-resolution augmented reality simulator with haptic feedback. Operative Neurosurgery 69:ons14–ons19
Loukas C, Lahanas V, Georgiou E (2013) An integrated approach to endoscopic instrument tracking for augmented reality applications in surgical simulation training. Int J Med Robot 9:e34–e51
Wang S, Parsons M, Stone-McLean J et al (2017) Augmented reality as a telemedicine platform for remote procedural training. Sensors (Basel) 17:2294
Ali S, Qandeel M, Ramakrishna R, Yang CW (2018) Virtual simulation in enhancing procedural training for fluoroscopy-guided lumbar puncture: a pilot study. Acad Radiol 25:235–239
Choque-Velasquez J, Colasanti R, Collan J et al (2018) Virtual reality glasses and eye-hands blind technique for microsurgical training in neurosurgery. World Neurosurg 112:126–130
McCarthy CJ, Yu AYC, Do S et al (2018) Interventional radiology training using a Dynamic Medical Immersive Training Environment (DynaMITE). J Am Coll Radiol 15:789–793
Hamacher A, Kim SJ, Cho ST et al (2016) Application of virtual, augmented, and mixed reality to urology. Int Neurourol J 20:172–181
Siff LN, Mehta N (2018) An interactive holographic curriculum for urogynecologic surgery. Obstet Gynecol 132(Suppl 1):27S–32S
Mu Y, Hocking D, Wang ZT et al (2020) Augmented reality simulator for ultrasound-guided percutaneous renal access. Int J Comput Assist Radiol Surg 15:749–757
Microsoft at MWC Barcelona: Introducing Microsoft HoloLens 2 - The Official Microsoft Blog. Available at https://blogs.microsoft.com/blog/2019/02/24/microsoft-at-mwc-barcelona-introducing-microsoft-hololens-2/. Accessed on 19 Dec 2020
Pratt P, Ives M, Lawton G et al (2018) Through the HoloLens™ looking glass: augmented reality for extremity reconstruction surgery using 3D vascular models with perforating vessels. Eur Radiol Exp 2:2
Rüger C, Feufel MA, Moosburner S et al (2020) Ultrasound in augmented reality: a mixed-methods evaluation of head-mounted displays in image-guided interventions. Int J Comput Assist Radiol Surg 15:1895–1905
Khera PS, Keshava SN (2014) An indigenous model for learning ultrasound-guided interventions. Indian J Radiol Imaging 24:132–134
Funding
The authors state that this work has not received any funding.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Guarantor
The scientific guarantor of this publication is Dimitri Amiras.
Statistics and biometry
No complex statistical methods were necessary for this paper.
Informed consent
Written informed consent was obtained from all subjects (patients) in this study.
Ethical approval
We did not seek Institutional Review Board approval since all human participation in the study was deemed to be of minimal risk and fulfilled in a non-clinical and non-invasive context under informed consent.
Conflict of interest
Philip Pratt is Chief Scientific Officer and Dimitri Amiras is a clinical advisor at Medical iSight Corporation. The authors of this manuscript declare no other relationships with any companies, whose products or services may be related to the subject matter of the article.
Methodology
• Prospective
• Experimental study
• Performed at one institution
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
D. Amiras and T. J. Hurkxkens are considered the first authors on this publication.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Amiras, D., Hurkxkens, T.J., Figueroa, D. et al. Augmented reality simulator for CT-guided interventions. Eur Radiol 31, 8897–8902 (2021). https://doi.org/10.1007/s00330-021-08043-0