
Gaze behavior is related to objective technical skills assessment during virtual reality simulator-based surgical training: a proof of concept

  • Original Article
  • Published in: International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Simulation-based training allows surgical skills to be learned safely. Most virtual reality-based surgical simulators address technical skills without considering non-technical skills such as gaze use. In this study, we investigated surgeons’ visual behavior during virtual reality-based surgical training in which visual guidance is provided. We hypothesized that gaze distribution across the environment correlates with the simulator’s technical skills assessment.

Methods

We recorded 25 surgical training sessions on an arthroscopic simulator; trainees wore a head-mounted eye-tracking device. To quantify gaze distribution, a U-net was trained on two sessions to segment three simulator-specific areas of interest (AoI) plus the background. We then tested whether the percentage of gaze samples in each area correlated with the simulator’s scores.
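To make the quantification step concrete, here is a minimal numpy sketch of how gaze samples could be binned into the segmented classes. The function name, array shapes, and label convention (0 for background, 1–3 for the three AoI) are assumptions for illustration; the paper does not detail its implementation.

```python
import numpy as np

def gaze_distribution(label_maps, gaze_xy, n_classes=4):
    """Percentage of gaze samples falling in each segmented class.

    label_maps : (T, H, W) int array, per-frame U-net segmentation
                 (assumed convention: 0 = background, 1-3 = the AoI).
    gaze_xy    : (T, 2) float array of gaze positions in pixels (x, y);
                 rows containing NaN (tracking loss) are skipped.
    """
    # Keep only frames where the eye tracker returned a gaze position.
    valid = ~np.isnan(gaze_xy).any(axis=1)
    t = np.nonzero(valid)[0]
    x = gaze_xy[valid, 0].round().astype(int).clip(0, label_maps.shape[2] - 1)
    y = gaze_xy[valid, 1].round().astype(int).clip(0, label_maps.shape[1] - 1)
    # Class label under each gaze sample, then counts per class.
    hits = label_maps[t, y, x]
    counts = np.bincount(hits, minlength=n_classes)
    return 100.0 * counts / max(counts.sum(), 1)  # percent per class
```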

Results

The neural network segmented all AoI with a mean Intersection over Union greater than 94% for each area. The percentage of gaze falling in each AoI differed among trainees. Despite several sources of data loss, we found significant correlations between gaze position and the simulator scores. For instance, trainees obtained better procedural scores when their gaze focused on the virtual assistance (Spearman correlation test, N = 7, r = 0.800, p = 0.031).
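For reference, the kind of rank-correlation test reported above can be run with scipy as sketched below. The per-trainee values are invented placeholders for illustration, not the study’s data.

```python
from scipy.stats import spearmanr

# Invented placeholder values for 7 trainees (not the study's data):
# percentage of gaze on the virtual-assistance AoI, and the simulator's
# procedural score for the same session.
gaze_pct_assistance = [12.4, 30.1, 8.7, 22.5, 18.0, 27.3, 15.6]
procedural_score = [55, 88, 47, 75, 62, 81, 58]

# Spearman's rho tests for a monotonic relationship between the ranks.
rho, p = spearmanr(gaze_pct_assistance, procedural_score)
print(f"Spearman r = {rho:.3f}, p = {p:.3f}")
```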

Conclusion

Our findings suggest that visual behavior should be quantified when assessing surgical expertise in simulation-based training environments, especially when visual guidance is provided. Ultimately, visual behavior could be used to quantitatively assess surgeons’ learning curves and expertise while training on VR simulators, complementing existing metrics.



Acknowledgements

This study is part of the French network of University Hospitals HUGO (“Hôpitaux Universitaires du Grand Ouest”). It was made possible thanks to the company VirtaMed.

Author information

Corresponding author

Correspondence to Pierre Jannin.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures involving human participants were conducted in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed consent

This article does not contain patient data.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Galuret, S., Vallée, N., Tronchot, A. et al. Gaze behavior is related to objective technical skills assessment during virtual reality simulator-based surgical training: a proof of concept. Int J CARS 18, 1697–1705 (2023). https://doi.org/10.1007/s11548-023-02961-8

