Touchless scanner control to support MRI-guided interventions

Abstract

Purpose

MRI-guided interventions allow minimally invasive, radiation-free treatment but rely on real-time image data and free slice positioning. Interventional interaction with the data and the MRI scanner is cumbersome due to the diagnostic focus of current systems, the confined space, and sterile conditions.

Methods

We present a touchless, hand-gesture-based interaction concept to control functions of the MRI scanner typically used during MRI-guided interventions. The system consists of a hand gesture sensor customised for MRI compatibility and a specialised UI that was developed based on clinical needs. A user study with 10 radiologists was performed to compare the gesture interaction concept and its components to task delegation—the prevalent method in clinical practice.

Results

Both methods performed comparably in terms of task duration and subjective workload. Subjective performance with gesture input was rated worse than with task delegation, but gesture input was rated acceptable in terms of usability, whereas task delegation was not.

Conclusion

This work contributes by (1) providing access to relevant functions of an MRI scanner during percutaneous interventions (2) in a way suitable for sterile human–computer interaction. The introduced concept eliminates indirect interaction with the scanner via an assistant, resulting in comparable subjective workload and task completion times together with higher perceived usability.




Funding

The work presented in this paper was partly funded by the Federal Ministry of Education and Research within the Forschungscampus STIMULATE under Grant Numbers 13GW0095A and 13GW0095C. Frank Wacker declares grants from Siemens Healthcare outside of this work.

Author information

Affiliations

Authors

Corresponding author

Correspondence to Benjamin Hatscher.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

For this type of study, formal consent is not required.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Supplementary material 1 (pdf 137 KB)

Supplementary material 2 (mp4 11200 KB)


About this article


Cite this article

Hatscher, B., Mewes, A., Pannicke, E. et al. Touchless scanner control to support MRI-guided interventions. Int J CARS 15, 545–553 (2020). https://doi.org/10.1007/s11548-019-02058-1


Keywords

  • Magnetic resonance imaging
  • Interventional
  • Human–computer interaction
  • Gestures
  • Task delegation
  • Radiology
  • Usability
  • Touchless interaction