
A novel gaze-supported multimodal human–computer interaction for ultrasound machines

  • Original Article
  • Published:
International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Conventional ultrasound (US) machines employ a physical control panel (PCP) as the primary user interface for machine control. This panel sits adjacent to the main machine display, which requires the operator’s constant attention. Shifting attention to the control panel can interrupt the flow of the medical examination, and some ultraportable machines lack many physical controls altogether. Furthermore, the need to simultaneously control the US machine and observe the US image may lead practitioners to adopt unergonomic postures and repetitive motions that can cause work-related injuries. There is therefore a need for a more efficient human–computer interaction method for US machines.

Methods

To address some of the limitations of the PCP, we propose merging the PCP into the main screen of the US machine. Using gaze tracking and a handheld controller, machine control can be achieved via a multimodal human–computer interaction (HCI) method that requires neither touching the screen nor looking away from the US image. As a first step, a pop-up menu and a measurement tool were designed on top of the US image, positioned according to gaze, for efficient machine control.
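The general interaction pattern can be illustrated with a short, self-contained sketch. The names below (read_gaze, menu_item_at, run_interaction), the menu items, and the mocked gaze and controller inputs are illustrative assumptions rather than the actual interfaces of the system described in the paper; the sketch only shows the pattern of opening a pop-up menu at the gaze point and confirming a selection with a handheld-controller button, which avoids the "Midas touch" problem of dwell-only gaze input.

```python
import random

# --- Illustrative stand-ins; the real device APIs are not shown in the paper (assumption) ---

def read_gaze():
    """Return the current gaze point (x, y) in screen pixels (mocked with jitter)."""
    return (400 + random.uniform(-5, 5), 300 + random.uniform(-5, 5))

MENU_ITEMS = ["Freeze", "Caliper", "Ellipse", "Depth", "Gain"]  # hypothetical menu entries
ITEM_HEIGHT = 60   # px; chosen larger than typical gaze-tracking error
MENU_WIDTH = 300   # px

def menu_item_at(gaze, menu_origin):
    """Map a gaze point to the pop-up menu item it falls on, or None."""
    gx, gy = gaze
    ox, oy = menu_origin
    idx = int(round((gy - oy) / ITEM_HEIGHT))
    if 0 <= idx < len(MENU_ITEMS) and abs(gx - ox) < MENU_WIDTH / 2:
        return MENU_ITEMS[idx]
    return None

def run_interaction(button_events):
    """Tiny state machine: the first controller-button press opens the menu at
    the current gaze point; the next press confirms whichever item the user is
    looking at. Requiring an explicit button press avoids the 'Midas touch'
    problem of dwell-only gaze selection."""
    menu_origin = None
    for pressed in button_events:
        gaze = read_gaze()
        if not pressed:
            continue
        if menu_origin is None:
            menu_origin = gaze            # open the pop-up menu under the gaze
            print(f"menu opened at {menu_origin}")
        else:
            print(f"selected: {menu_item_at(gaze, menu_origin)}")
            menu_origin = None            # close the menu after selection

if __name__ == "__main__":
    # Simulated controller input: press (open menu), idle, press (confirm).
    run_interaction([True, False, True])
```

In practice the gaze samples would come from the eye tracker's streaming API and the button events from the handheld controller; the key design choice sketched here is that gaze supplies position while the controller supplies explicit confirmation.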

Results

A comparative study was performed on the BK Medical SonixTOUCH US machine. Participants were asked to measure the area of an ellipse-shaped tumor in a phantom using both our gaze-supported HCI method and the traditional method. The user study indicates that task completion time can be reduced by 20.6% when using our gaze-supported HCI, with no extra workload imposed on the operators.
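For context on the measurement task, the area of an ellipse-shaped target is conventionally computed from two caliper-defined axes as pi times the product of the semi-axes. The helper below is a minimal sketch of that computation; the function name and argument convention are hypothetical and not part of the SonixTOUCH software, and it assumes the operator has placed the four axis endpoints (e.g. with gaze-positioned calipers).

```python
import math

def ellipse_area_from_calipers(p1, p2, q1, q2):
    """Area of an ellipse defined by the endpoints of its two axes.

    p1, p2: endpoints of the major axis, (x, y) in image units (e.g. mm)
    q1, q2: endpoints of the minor axis
    """
    a = math.dist(p1, p2) / 2.0   # semi-major axis length
    b = math.dist(q1, q2) / 2.0   # semi-minor axis length
    return math.pi * a * b        # standard ellipse area: pi * a * b

# Example: axes of 20 mm and 12 mm give an area of roughly 188.5 mm^2.
print(f"{ellipse_area_from_calipers((0, 0), (20, 0), (10, -6), (10, 6)):.1f} mm^2")
```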

Conclusions

Our preliminary study suggests that, when combined with a simple handheld controller, eye gaze tracking can be integrated into the US machine HCI for more efficient machine control.



Acknowledgements

The authors would like to thank Mrs. Vickie Lessoway, Mrs. Irene Tong, Mr. Zhaoshuo Li, and Mr. Neerav Patel for their support and help during the study. The authors also thank the volunteer participants in the user study, whose comments and feedback were essential to improving the design of our system.

Author information


Corresponding author

Correspondence to Hongzhi Zhu.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

Our research was approved by the Behavioral Research Ethics Board at the University of British Columbia (UBC), Vancouver, Canada. All procedures performed in our study involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This research is funded by the Natural Sciences and Engineering Research Council of Canada.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 52293 KB)


About this article


Cite this article

Zhu, H., Salcudean, S.E. & Rohling, R.N. A novel gaze-supported multimodal human–computer interaction for ultrasound machines. Int J CARS 14, 1107–1115 (2019). https://doi.org/10.1007/s11548-019-01964-8

