Device- and system-independent personal touchless user interface for operating rooms

One personal UI to control all displays in an operating room
  • Meng Ma (corresponding author)
  • Pascal Fallavollita
  • Séverine Habert
  • Simon Weidert
  • Nassir Navab
Original Article

Abstract

Introduction

In the modern-day operating room, the surgeon performs surgery with the support of different medical systems that display patient information, physiological data, and medical images. It is generally accepted that numerous interactions must be performed by the surgical team to control the corresponding medical system and retrieve the desired information. Joysticks and physical keys are still present in the operating room because of the drawbacks of the computer mouse, and surgeons often have to relay instructions to the surgical team when they require information from a specific medical system. In this paper, a novel user interface is developed that allows the surgeon to personally perform touchless interaction with the various medical systems and to switch effortlessly among them, all without modifying the systems' software or hardware.

Methods

To achieve this, a wearable RGB-D sensor is mounted on the surgeon's head for inside-out tracking of his/her finger relative to any of the medical systems' displays. Android devices running a dedicated application are connected to the computers on which the medical systems run, each emulating a standard USB mouse and keyboard. When the surgeon interacts using pointing gestures, the desired cursor position on the targeted medical system display and the recognized gestures are translated into generic events and sent to the corresponding Android device. Finally, the application running on that Android device generates the matching mouse or keyboard events for the targeted medical system.
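
As an illustration of the event pipeline described above, the following minimal Python sketch (not the authors' implementation; the JSON message format, function names, port, and address are assumptions) shows how a fingertip hit point on a display could be normalized into a display-independent pointer event and sent over the network to the Android device attached to that medical system:

    import json
    import socket

    def make_pointer_event(px, py, display_w, display_h, gesture="move"):
        """Build a display-independent event from a pixel hit point."""
        return {
            "type": gesture,                          # e.g. "move", "click", "scroll"
            "x": max(0.0, min(1.0, px / display_w)),  # normalized horizontal position
            "y": max(0.0, min(1.0, py / display_h)),  # normalized vertical position
        }

    def send_event(event, host, port=5555):
        """Send one JSON-encoded event to the Android device of the targeted display."""
        with socket.create_connection((host, port), timeout=1.0) as sock:
            sock.sendall((json.dumps(event) + "\n").encode("utf-8"))

    if __name__ == "__main__":
        # Fingertip points at pixel (960, 540) of a 1920x1080 fluoroscopy display.
        event = make_pointer_event(960, 540, 1920, 1080, gesture="click")
        send_event(event, host="192.168.0.42")  # hypothetical address of that Android device

Normalizing the cursor position keeps the event format independent of each display's resolution; the receiving application can rescale it to the actual screen size before emitting the corresponding mouse event.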

Results and conclusion

To simulate an operating room setting, our user interface was tested by seven medical participants who performed several interactions with visualizations of CT, MRI, and fluoroscopy images at varying distances from the displays. Results from the System Usability Scale and the NASA-TLX workload index indicated strong acceptance of the proposed user interface.
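
For context on the evaluation instruments, the System Usability Scale aggregates ten 1-5 ratings into a 0-100 score. The short Python sketch below shows the standard scoring rule (Brooke 1996); the ratings in the example are illustrative and are not data from this study:

    def sus_score(responses):
        """Standard SUS scoring: odd-numbered items contribute (rating - 1),
        even-numbered items contribute (5 - rating); the sum is scaled by 2.5."""
        assert len(responses) == 10
        total = sum((r - 1) if i % 2 == 0 else (5 - r)
                    for i, r in enumerate(responses))
        return total * 2.5

    # Hypothetical ratings for items 1 through 10
    print(sus_score([5, 2, 4, 1, 5, 2, 5, 1, 4, 2]))  # prints 87.5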

Keywords

User interface · Operating room · Multimodal interaction · Finger pointing gesture

Notes

Acknowledgments

This work was partly supported by the China Scholarship Council (file No. 201206110030). We also want to thank Sergii Pylypenko for the virtual devices in the Linux kernel used to simulate the USB mouse and keyboard.
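
The virtual-device approach mentioned above can be sketched as follows, assuming the python-uinput package (a Python binding to the Linux uinput interface; this is not the authors' Android implementation): a received generic event is replayed as a real mouse event on the workstation the device is connected to.

    import uinput  # assumes python-uinput and access to /dev/uinput

    # Register a virtual input device that can emit relative mouse motion and clicks.
    device = uinput.Device([uinput.REL_X, uinput.REL_Y, uinput.BTN_LEFT])

    # Replay a received generic event: move the cursor, then perform a left click.
    device.emit(uinput.REL_X, 15)        # move 15 units right
    device.emit(uinput.REL_Y, -5)        # move 5 units up
    device.emit_click(uinput.BTN_LEFT)   # left-button click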

Supplementary material

Supplementary material 1 (mp4 25276 KB)

References

  1. Brooke J (1996) SUS: a quick and dirty usability scale. Usability Eval Ind 189(194):4–7
  2. Ebert LC, Hatch G, Ampanozi G, Thali MJ, Ross S (2012) You can't touch this: touch-free navigation through radiological images. Surg Innov 19(3):301–307. doi: 10.1177/1553350611425508
  3. Ebert LC, Hatch G, Thali MJ, Ross S (2013) Invisible touch: control of a DICOM viewer with finger gestures using the Kinect depth camera. J Forensic Radiol Imaging 1(1):10–14. doi: 10.1016/j.jofri.2012.11.006
  4. Grätzel C, Fong T, Grange S, Baur C (2004) A non-contact mouse for surgeon-computer interaction. Technol Health Care 12(3):245–257
  5. Hart SG (2006) NASA-Task Load Index (NASA-TLX); 20 years later. Proc Hum Factors Ergon Soc Annu Meet 50(9):904–908. doi: 10.1177/154193120605000909
  6. Ionescu A (2006) A mouse in the O.R. Ambidextrous Mag 4:30–32
  7. Jalaliniya S, Pederson T (2015) Designing wearable personal assistants for surgeons: an egocentric approach. IEEE Pervasive Comput 14(3):22–31
  8. Jalaliniya S, Smith J, Sousa M, Büthe L, Pederson T (2013) Touch-less interaction with medical images using hand & foot gestures. In: Proceedings of the 2013 ACM conference on pervasive and ubiquitous computing adjunct publication (UbiComp '13 Adjunct). ACM Press, New York, pp 1265–1274. doi: 10.1145/2494091.2497332
  9. Johnson R, O'Hara K, Sellen A, Cousins C, Criminisi A (2011) Exploring the potential for touchless interaction in image-guided interventional radiology. In: Proceedings of the 2011 annual conference on human factors in computing systems (CHI '11), pp 3323–3332. doi: 10.1145/1978942.1979436
  10. Ma M, Merckx K, Fallavollita P, Navab N (2015) [POSTER] Natural user interface for ambient objects. In: 2015 IEEE international symposium on mixed and augmented reality (ISMAR), Fukuoka, Japan, pp 76–79. doi: 10.1109/ISMAR.2015.25
  11. Moraes TF, Amorim PHJ, Azevedo FS, Silva JVL (2012) InVesalius: an open-source imaging application. Comput Vis Med Image Process 19:405–408
  12. Norman DA (2010) Natural user interfaces are not natural. Interactions 17(3):6. doi: 10.1145/1744161.1744163
  13. O'Hara K, Dastur N, Carrell T, Gonzalez G, Sellen A, Penney G, Varnavas A, Mentis H, Criminisi A, Corish R, Rouncefield M (2014) Touchless interaction in surgery. Commun ACM 57(1):70–77. doi: 10.1145/2541883.2541899
  14. Pederson T, Janlert LE, Surie D (2010) Towards a model for egocentric interaction with physical and virtual objects. In: Proceedings of the 6th Nordic conference on human-computer interaction: extending boundaries (NordiCHI '10). ACM Press, New York, p 755. doi: 10.1145/1868914.1869022
  15. Rosa GM, Elizondo ML (2014) Use of a gesture user interface as a touchless image navigation system in dental surgery: case series report. Imaging Sci Dent 44(2):155–160. doi: 10.5624/isd.2014.44.2.155
  16. Schwarz LA, Bigdelou A, Navab N (2011) Learning gestures for customizable human-computer interaction in the operating room. Lect Notes Comput Sci 6891:129–136. doi: 10.1007/978-3-642-23623-5_17
  17. Strickland M, Tremaine J, Brigley G, Law C (2013) Using a depth-sensing infrared camera system to access and manipulate medical imaging from within the sterile operating field. Can J Surg 56(3):1–6. doi: 10.1503/cjs.035311
  18. Tan JH, Chao C, Zawaideh M, Roberts AC, Kinney TB (2013) Informatics in radiology: developing a touchless user interface for intraoperative image control during interventional radiology procedures. Radiographics 33(2):E61–E70. doi: 10.1148/rg.332125101
  19. Tangcharoen T, Bell A, Hegde S, Hussain T, Beerbaum P, Schaeffter T, Razavi R, Botnar RM, Greil GF (2011) Detection of coronary artery anomalies in infants and young children with congenital heart disease by using MR imaging. Radiology. doi: 10.1148/radiol.10100828
  20. Wachs J, Stern H, Edan Y (2008) Real-time hand gesture interface for browsing medical images. J Intell 2(1):15–25

Copyright information

© CARS 2016

Authors and Affiliations

  • Meng Ma (1, 2), corresponding author
  • Pascal Fallavollita (1)
  • Séverine Habert (1)
  • Simon Weidert (3)
  • Nassir Navab (1, 4)

  1. Fakultät für Informatik, Technische Universität München, Garching b. München, Germany
  2. National University of Defense Technology, Changsha, China
  3. Chirurgische Klinik und Poliklinik - Innenstadt, LMU, Munich, Germany
  4. Johns Hopkins University, Baltimore, USA
