Hand-eye coordination-based implicit re-calibration method for gaze tracking on ultrasound machines: a statistical approach

  • Original Article
International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Eye gaze tracking is proving beneficial in many biomedical applications. The performance of gaze-tracking systems depends strongly on the accuracy of their calibration, and it has been reported that gaze tracking accuracy deteriorates cumulatively and significantly with usage time. This deterioration impedes the wide adoption of gaze tracking in user interfaces.

Methods

Explicit re-calibration, which typically requires the user’s active attention, is time-consuming and can interfere with the user’s main activity. Therefore, we propose an implicit re-calibration method, which can rectify the deterioration of gaze tracking accuracy without demanding the user’s deliberate attention. We make use of hand-eye coordination, under the reasonable assumption that the eye gaze follows the pointer during a selection task, to acquire additional calibration points during normal usage of a gaze-contingent system. We construct a statistical model of the calibration and the hand-eye coordination and apply the Gaussian process regression framework to perform the re-calibration.
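To make the idea concrete, the sketch below is an illustrative approximation, not the authors’ implementation: it uses scikit-learn’s Gaussian process regression to map drifted gaze estimates, recorded at pointer-click time, back to the pointer positions that the hand-eye coordination assumption treats as ground-truth fixation targets. The simulated drift, the RBF-plus-white-noise kernel, and the per-axis regressors are all assumptions made for the example.

```python
# Illustrative sketch of implicit gaze re-calibration via Gaussian process
# regression (an assumption-laden example, not the paper's statistical model).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Implicit calibration points gathered during normal use: each row pairs the
# tracker's (drifted) gaze estimate at click time with the pointer position,
# which we assume the eye was fixating during the selection task.
rng = np.random.default_rng(0)
clicks = rng.uniform(0, 1, size=(40, 2))        # assumed true fixation targets
drift = np.array([0.03, -0.02])                 # simulated systematic drift
gaze_at_click = clicks + drift + rng.normal(0, 0.01, size=(40, 2))

# Fit one GP per screen axis, mapping drifted gaze -> corrected gaze.
# The kernel choice here is an illustrative assumption.
kernel = RBF(length_scale=0.5) + WhiteKernel(noise_level=1e-4)
gps = [GaussianProcessRegressor(kernel=kernel).fit(gaze_at_click, clicks[:, d])
       for d in range(2)]

def recalibrate(gaze_xy):
    """Apply the implicit re-calibration to new gaze samples."""
    g = np.atleast_2d(gaze_xy)
    return np.column_stack([gp.predict(g) for gp in gps])

print(recalibrate([0.53, 0.48]))  # drift-corrected gaze estimate
```

In an actual deployment, the (gaze, pointer) pairs would be harvested from real selection events during normal use, with some filtering to discard clicks that the eye did not accompany.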

Results

To validate our model and method, we performed a user study on ultrasonography tasks using a gaze-contingent interface for ultrasound machines. Results suggest that our method rectifies the tracking accuracy deterioration in 75% of the cases where deterioration occurred in our user study. On another benchmark dataset, our method restored tracking accuracy to a level comparable to the initial calibration in more than 80% of cases.

Conclusions

Our implicit re-calibration method is a practical and convenient fix for tracking accuracy deterioration in gaze-contingent user interfaces, and in particular for gaze-contingent ultrasound machines.

References

  1. Atkins MS, Moise A, Rohling R (2006) An application of eyegaze tracking for designing radiologists’ workstations: insights for comparative visual search tasks. ACM Trans Appl Percept 3(2):136–151

  2. Borg LK, Harrison TK, Kou A, Mariano ER, Udani AD, Kim TE, Shum C, Howard SK (2018) Preliminary experience using eye-tracking technology to differentiate novice and expert image interpretation for ultrasound-guided regional anesthesia. J Ultrasound Med 37(2):329–336

  3. Cai Y, Sharma H, Chatelain P, Noble JA (2018) Multi-task SonoEyeNet: detection of fetal standardized planes assisted by generated sonographer attention maps. In: International conference on medical image computing and computer-assisted intervention, pp 871–879. Springer

  4. Carrigan AJ, Brennan PC, Pietrzyk M, Clarke J, Chekaluk E (2015) A ‘snapshot’ of the visual search behaviours of medical sonographers. Aust J Ultrasound Med 18(2):70–77

  5. Chatelain P, Sharma H, Drukker L, Papageorghiou AT, Noble JA (2018) Evaluation of gaze tracking calibration for longitudinal biomedical imaging studies. IEEE Trans Cybern

  6. Cherif ZR, Nait-Ali A, Motsch J, Krebs M (2002) An adaptive calibration of an infrared light device used for gaze tracking. In: IMTC/2002, Proceedings of the 19th IEEE instrumentation and measurement technology conference (IEEE Cat. No. 00CH37276), vol 2, pp 1029–1033. IEEE

  7. Droste R, Cai Y, Sharma H, Chatelain P, Drukker L, Papageorghiou AT, Noble JA (2019) Ultrasound image representation learning by modeling sonographer visual attention. In: International conference on information processing in medical imaging, pp 592–604. Springer

  8. Gomez AR, Gellersen H (2018) Smooth-i: smart re-calibration using smooth pursuit eye movements. In: Proceedings of the 2018 ACM symposium on eye tracking research and applications, p 10. ACM

  9. Hastie T, Tibshirani R (1990) Exploring the nature of covariate effects in the proportional hazards model. Biometrics 3:1005–1016

  10. Hornof AJ, Halverson T (2002) Cleaning up systematic error in eye-tracking data by using required fixation locations. Behav Res Methods Instrum Comput 34(4):592–604

  11. Jia S, Koh DH, Pomplun M (2018) Gaze tracking accuracy maintenance using traffic sign detection. In: Adjunct proceedings of the 10th international conference on automotive user interfaces and interactive vehicular applications, pp 87–91. ACM

  12. Johansen SA, San Agustin J, Skovsgaard H, Hansen JP, Tall M (2011) Low cost vs. high-end eye tracking for usability testing. In: CHI’11 extended abstracts on human factors in computing systems, pp 1177–1182. ACM

  13. Kosevoi-Tichie A, Berghea F, Vlad V, Abobului M, Trandafir M, Gudu T, Peltea A, Duna M, Groseanu L, Patrascu C, Ionescu R (2015) Does eye gaze tracking have the ability to assess how rheumatologists evaluate musculoskeletal ultrasound images?

  14. Levine S, Pastor P, Krizhevsky A, Ibarz J, Quillen D (2018) Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection. Int J Robot Res 37(4–5):421–436

  15. Li Z, Tong I, Metcalf L, Hennessey C, Salcudean SE (2018) Free head movement eye gaze contingent ultrasound interfaces for the da Vinci surgical system. IEEE Robot Autom Lett 3(3):2137–2143

  16. Liebling DJ, Dumais ST (2014) Gaze and mouse coordination in everyday work. In: Proceedings of the 2014 ACM international joint conference on pervasive and ubiquitous computing: adjunct publication, pp 1141–1150. ACM

  17. Papoutsaki A, Gokaslan A, Tompkin J, He Y, Huang J (2018) The eye of the typer: a benchmark and analysis of gaze behavior during typing. In: Proceedings of the 2018 ACM symposium on eye tracking research and applications, p 16. ACM

  18. Sidenmark L, Lundström A (2019) Gaze behaviour on interacted objects during hand interaction in virtual reality for eye tracking calibration. In: Eleventh edition of the ACM symposium on eye tracking research and applications (ETRA 2019)

  19. Tripathi S, Guenter B (2017) A statistical approach to continuous self-calibrating eye gaze tracking for head-mounted virtual reality systems. In: 2017 IEEE winter conference on applications of computer vision (WACV), pp 862–870. IEEE

  20. Williams CK, Rasmussen CE (2006) Gaussian processes for machine learning, vol 2. MIT Press, Cambridge

  21. Zhu H, Salcudean S, Rohling R (2019) The Neyman–Pearson detection of microsaccades with maximum likelihood estimation of parameters. J Vis 19(13):17–17

  22. Zhu H, Salcudean SE, Rohling RN (2019) A novel gaze-supported multimodal human-computer interaction for ultrasound machines. Int J Comput Assist Radiol Surg 14(7):1107–1115

Acknowledgements

The authors thank all participants in our user study, as well as the authors of [17], who shared their experimental data publicly.

Funding

This work was supported by an NSERC Strategic Project Grant and by infrastructure purchased with grants from CFI, involving Canadian federal and British Columbia Provincial support.

Author information

Corresponding author

Correspondence to Hongzhi Zhu.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

The user study performed in our research was approved by the Behavioural Research Ethics Board at the University of British Columbia.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Zhu, H., Rohling, R.N. & Salcudean, S.E. Hand-eye coordination-based implicit re-calibration method for gaze tracking on ultrasound machines: a statistical approach. Int J CARS 15, 837–845 (2020). https://doi.org/10.1007/s11548-020-02143-w
