Interactive Training and Operation Ecosystem for Surgical Tasks in Mixed Reality

  • Ehsan Azimi
  • Camilo Molina
  • Alexander Chang
  • Judy Huang
  • Chien-Ming Huang
  • Peter Kazanzides
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11041)

Abstract

Inadequate skill in performing surgical tasks can lead to medical errors and cause avoidable injury or death to patients. Moreover, a novice surgeon or resident may not have access to an expert while performing a task.

We therefore propose an interactive mixed-reality ecosystem for both training and practice of surgical tasks, comprising authoring of the desired surgical task, immersive training and practice, assessment of the trainee, and remote coaching and analysis. This information-based ecosystem also provides data for training machine learning algorithms.

Our interactive ecosystem includes a head-mounted display (HMD) application that provides feedback as well as audiovisual assistance for both training and live clinical performance of the task. In addition, a remote monitoring station gives the expert a real-time view of the scene from the user's perspective and enables guidance through annotations drawn directly onto the user's scene. We use bedside ventriculostomy, a neurosurgical procedure, as our illustrative use case; however, the system's modular design makes it extensible to other procedures.
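The expert-to-trainee guidance channel described above can be illustrated with a minimal sketch: the remote expert marks a point on the streamed video frame, and the annotation is serialized for the HMD to render in the trainee's view. This is a hypothetical wire format for illustration only; the paper does not specify the actual protocol, and all names (`Annotation`, `encode`, `decode`, the normalized-coordinate convention) are assumptions.

```python
# Hypothetical sketch of the remote annotation channel: the expert annotates
# a streamed frame, and the message is sent to the HMD for display.
import json
from dataclasses import dataclass, asdict


@dataclass
class Annotation:
    frame_id: int  # index of the video frame the expert annotated
    x: float       # normalized image coordinates in [0, 1]
    y: float
    label: str     # short instruction shown next to the marker


def encode(a: Annotation) -> str:
    """Serialize an annotation for transmission to the HMD."""
    return json.dumps(asdict(a))


def decode(msg: str) -> Annotation:
    """Reconstruct the annotation on the HMD side."""
    return Annotation(**json.loads(msg))


# Round trip: what the expert sends is what the trainee's HMD receives.
a = Annotation(frame_id=42, x=0.5, y=0.25, label="entry point")
assert decode(encode(a)) == a
```

In a real deployment the HMD would additionally map the 2D image coordinates back into its tracked 3D scene before anchoring the marker, which depends on the display calibration and tracking components of the system.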

Keywords

Surgical training and assessment · Medical augmented reality · Surgical simulation and modeling · Artificial intelligence

Acknowledgement

We thank Professor Russell Taylor for his guidance and Patrick Myers, Benjamin Pikus, Prateek Bhatnagar and Allan Wang for their assistance with the software development.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Ehsan Azimi (1)
  • Camilo Molina (2)
  • Alexander Chang (1)
  • Judy Huang (2)
  • Chien-Ming Huang (1)
  • Peter Kazanzides (1)

  1. Department of Computer Science, Johns Hopkins University, Baltimore, USA
  2. Department of Neurosurgery, Johns Hopkins Hospital, Baltimore, USA