Auditory display as feedback for a novel eye-tracking system for sterile operating room interaction

  • David Black
  • Michael Unger
  • Nele Fischer
  • Ron Kikinis
  • Horst Hahn
  • Thomas Neumuth
  • Bernhard Glaser
Original Article

Abstract

Purpose

The growing number of technical systems in the operating room has drawn increased attention to the development of touchless interaction methods for sterile conditions. However, touchless interaction paradigms lack the tactile feedback found in common input devices such as mice and keyboards. We propose a novel touchless eye-tracking interaction system with auditory display as a feedback method for completing typical operating room tasks. The auditory display provides feedback about the input selected via the eye-tracking system as well as confirmation of the system response.

Methods

An eye-tracking system with a novel auditory display using both earcons and parameter-mapping sonification was developed to allow touchless interaction for six typical scrub nurse tasks. An evaluation with novice participants compared auditory display with visual display with respect to reaction time and a series of subjective measures.
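
As background for the two feedback types named in the abstract, the following minimal Python sketch contrasts an earcon (a short, symbolic tone motif that could confirm a system response) with parameter-mapping sonification (a continuous tone whose pitch tracks gaze dwell progress on a target). This is an illustrative sketch under stated assumptions, not the authors' implementation; the tone frequencies, durations, mapping, and file names are all hypothetical.

```python
# Illustrative sketch only (not the system described in the paper):
# contrasts an earcon with parameter-mapping sonification for a
# dwell-based eye-tracking selection. Uses only the Python stdlib.

import math
import struct
import wave

RATE = 44100  # samples per second


def sine(freq_hz, dur_s, amp=0.4):
    """Generate one sine tone as a list of float samples in [-1, 1]."""
    n = int(RATE * dur_s)
    return [amp * math.sin(2 * math.pi * freq_hz * i / RATE) for i in range(n)]


def earcon_confirm():
    """Earcon: a fixed, symbolic two-note rising motif (hypothetical
    pitches) that could confirm the system response after selection."""
    return sine(660, 0.08) + sine(880, 0.12)


def dwell_sonification(progress_steps=10, step_s=0.05):
    """Parameter mapping: pitch rises linearly with dwell progress
    (0..1), giving continuous feedback while the gaze rests on a
    target. The 300-900 Hz range is an assumption for illustration."""
    samples = []
    for k in range(progress_steps):
        progress = k / (progress_steps - 1)   # dwell progress, 0.0 .. 1.0
        freq = 300 + progress * (900 - 300)   # map progress to pitch
        samples += sine(freq, step_s)
    return samples


def write_wav(path, samples):
    """Write mono 16-bit PCM so the feedback can be auditioned."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(RATE)
        frames = b"".join(
            struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767))
            for s in samples
        )
        w.writeframes(frames)


if __name__ == "__main__":
    write_wav("dwell_feedback.wav", dwell_sonification())  # input feedback
    write_wav("confirm_earcon.wav", earcon_confirm())      # response confirmation
```

In a real interactive system the tones would be synthesized and played in real time (the paper's references point to Pure Data for this role) rather than written to files; the sketch only makes the earcon/parameter-mapping distinction concrete.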

Results

When auditory display was used to substitute for the lost tactile feedback during eye-tracking interaction, participants exhibited reduced reaction times compared with a visual-only display. In addition, the auditory feedback led to lower subjective workload and higher ratings of usefulness and system acceptance.

Conclusion

Because eye tracking and other touchless interaction methods lack tactile feedback, auditory display is shown to be a useful and necessary addition to new interaction concepts for the sterile operating room, reducing reaction times while improving subjective measures, including usefulness, user satisfaction, and cognitive workload.

Keywords

Auditory display · Sterile interaction · Eye tracking · Scrub nurse · Touchless interaction · Operating room · Digital operating room · Human–computer interaction

Notes

Acknowledgements

The study was partly supported by National Institutes of Health Grants P41 EB015902, P41 EB015898, R01EB014955, and U24CA180918.

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

For this type of study, formal ethics approval was not required by the institutional ethics committee.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Copyright information

© CARS 2017

Authors and Affiliations

  1. Medical Image Computing, University of Bremen, Bremen, Germany
  2. Jacobs University, Bremen, Germany
  3. Fraunhofer MEVIS, Bremen, Germany
  4. Innovation Center Computer Assisted Surgery, Leipzig, Germany
  5. Leipzig University of Applied Sciences, Leipzig, Germany
