Interactive Training and Operation Ecosystem for Surgical Tasks in Mixed Reality
Inadequate skill in performing surgical tasks can lead to medical errors and cause avoidable injury or death to patients. Moreover, a novice surgeon or resident often does not have access to an expert while performing a task.
We therefore propose an interactive mixed-reality ecosystem for both training and practice of surgical tasks, which consists of authoring the desired surgical task, immersive training and practice, assessment of the trainee, and remote coaching and analysis. This information-based ecosystem also provides the data needed to train machine learning algorithms.
Our interactive ecosystem includes a head-mounted display (HMD) application that provides feedback as well as audiovisual assistance, both for training and for live clinical performance of the task. In addition, a remote monitoring station gives the expert a real-time view of the scene from the user's perspective and enables guidance through annotations drawn directly on the user's scene. We use bedside ventriculostomy, a neurosurgical procedure, as our illustrative use case; however, the modular design of the system makes it extensible to other procedures.
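To make the HMD-to-station exchange concrete, the following is a minimal sketch of how an expert annotation might be serialized and sent back to the HMD, assuming a JSON message format over some transport. The Annotation type, its fields, and the encode/decode helpers are hypothetical illustrations, not the system's actual implementation.

```python
# Minimal sketch (not the authors' implementation) of a message format a
# remote monitoring station might use to push expert annotations back onto
# the HMD user's view. All names here (Annotation, encode, decode) are
# hypothetical illustrations of the described HMD <-> station exchange.
import json
from dataclasses import dataclass, asdict
from typing import List, Tuple

@dataclass
class Annotation:
    """An expert-drawn stroke, in normalized HMD image coordinates."""
    frame_id: int                      # video frame the expert annotated
    points: List[Tuple[float, float]]  # polyline vertices in [0, 1] x [0, 1]
    color: str = "#00FF00"

def encode(a: Annotation) -> bytes:
    """Serialize an annotation for transmission to the HMD."""
    return json.dumps(asdict(a)).encode("utf-8")

def decode(payload: bytes) -> Annotation:
    """Reconstruct an annotation on the HMD side."""
    d = json.loads(payload.decode("utf-8"))
    d["points"] = [tuple(p) for p in d["points"]]
    return Annotation(**d)

# Example round trip: the station annotates frame 1042; the HMD decodes the
# stroke and would re-project it into the wearer's current view.
wire = encode(Annotation(frame_id=1042, points=[(0.45, 0.30), (0.52, 0.41)]))
print(decode(wire))
```

A text-based format of this kind keeps the protocol inspectable and language-neutral between an HMD client and a desktop monitoring station, at the cost of some bandwidth compared to a binary encoding.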
Keywords: Surgical training and assessment · Medical augmented reality · Surgical simulation and modeling · Artificial intelligence
We thank Professor Russell Taylor for his guidance and Patrick Myers, Benjamin Pikus, Prateek Bhatnagar and Allan Wang for their assistance with the software development.