
Integrating Eye-Tracking to Augmented Reality System for Surgical Training

  • Education & Training
  • Published in: Journal of Medical Systems

Abstract

Augmented reality (AR) has been utilized for surgical training, where displaying instructional information at the right moment is critical for skill acquisition. We built a new surgical training platform that combines an AR headset (HoloLens, Microsoft) with a wearable eye-tracker (Pupil Labs, Germany). Our goal is to detect moments of performance difficulty using the integrated eye-tracker, so that the system can display instructions at the precise moment the user is seeking instructional information during simulated surgical skill practice. In this paper, we describe the system design, system calibration, and data transfer between the two devices.
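The abstract's core idea, detecting moments of performance difficulty from the eye-tracker stream, builds on the authors' earlier findings that pupil dilation tracks surgical task difficulty. The paper's actual detection method is not reproduced here; the sketch below is an illustrative rolling-baseline dilation trigger, where the class name, window size, and threshold ratio are assumptions, not the authors' implementation or parameters.

```python
from collections import deque


class DifficultyDetector:
    """Illustrative sketch: flag likely moments of performance difficulty
    from a stream of pupil-diameter samples. A sustained dilation above a
    rolling baseline is treated as the trigger for showing AR instructions.
    All parameters here are hypothetical, not taken from the paper."""

    def __init__(self, baseline_window=120, threshold_ratio=1.15, min_hits=3):
        self.baseline = deque(maxlen=baseline_window)  # recent "resting" diameters (mm)
        self.threshold_ratio = threshold_ratio          # dilation relative to baseline mean
        self.min_hits = min_hits                        # consecutive elevated samples required
        self._hits = 0

    def update(self, diameter_mm):
        """Feed one pupil-diameter sample; return True once difficulty is detected."""
        if len(self.baseline) < self.baseline.maxlen // 2:
            # Still collecting an initial baseline; never trigger yet.
            self.baseline.append(diameter_mm)
            return False
        mean = sum(self.baseline) / len(self.baseline)
        if diameter_mm > mean * self.threshold_ratio:
            self._hits += 1                  # elevated sample: count toward a trigger
        else:
            self._hits = 0
            self.baseline.append(diameter_mm)  # only fold non-elevated samples into baseline
        return self._hits >= self.min_hits
```

In an integrated system, `update` would be called on each gaze datum received from the eye-tracker, and a `True` result would cue the HoloLens to display the instructional overlay.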


[Figures 1–6: not reproduced in this preview]


Acknowledgements

We thank the Natural Sciences and Engineering Research Council of Canada (NSERC) for supporting the technological development of this project through a Discovery Grant to Dr. Zheng. We also thank the University of Alberta Provost Office for supporting the educational application of this project through a Teaching & Learning Enhancement Fund award to Dr. Zheng.

Author information

Corresponding author

Correspondence to Bin Zheng.

Ethics declarations

Disclosure of potential conflicts of interest

The authors declare that they have no conflict of interest related to this research work.

Ethical approval

This article does not contain any studies with animals performed by any of the authors.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Lu, S., Sanchez Perdomo, Y.P., Jiang, X. et al. Integrating Eye-Tracking to Augmented Reality System for Surgical Training. J Med Syst 44, 192 (2020). https://doi.org/10.1007/s10916-020-01656-w
