Abstract
Purpose
Eye gaze tracking is proving beneficial in many biomedical applications. The performance of systems based on eye gaze tracking depends strongly on the accuracy of their calibration. It has been reported that gaze tracking accuracy deteriorates cumulatively and significantly with usage time, which impedes the wide use of gaze tracking in user interfaces.
Methods
Explicit re-calibration, which typically requires the user's active attention, is time-consuming and can interfere with the user's main activity. We therefore propose an implicit re-calibration method that rectifies the deterioration of gaze tracking accuracy without requiring the user's deliberate attention. We exploit hand-eye coordination, under the reasonable assumption that the eye gaze follows the pointer during a selection task, to acquire additional calibration points during normal usage of a gaze-contingent system. We construct a statistical model of the calibration and the hand-eye coordination and apply the Gaussian process regression framework to perform the re-calibration.
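The idea above can be sketched in code. This is a minimal, hypothetical illustration (not the authors' implementation): each pointer selection is assumed to yield a pair of raw gaze coordinates and the true target location, and a Gaussian process regressor is fitted to the residual offsets to correct subsequent gaze estimates. The kernel hyperparameters and the synthetic drift model are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=100.0, variance=25.0):
    """Squared-exponential kernel over 2-D screen coordinates (pixels)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale ** 2)

def gp_correct(raw_train, offsets_train, raw_query, noise=1.0):
    """GP posterior mean of the gaze offset at query points, learned from
    (raw gaze, offset) pairs collected implicitly at pointer clicks."""
    K = rbf_kernel(raw_train, raw_train) + noise * np.eye(len(raw_train))
    Ks = rbf_kernel(raw_query, raw_train)
    alpha = np.linalg.solve(K, offsets_train)  # one solve per output dim
    return Ks @ alpha

# Synthetic demonstration: targets are click locations on a 1000x1000 px
# screen; the tracker output has drifted by a constant offset plus noise.
rng = np.random.default_rng(0)
targets = rng.uniform(0, 1000, size=(40, 2))   # pointer click locations
drift = np.array([15.0, -10.0])                # simulated calibration drift
raw = targets + drift + rng.normal(0, 1.0, targets.shape)

est_offset = gp_correct(raw, targets - raw, raw)
corrected = raw + est_offset
mean_err = np.linalg.norm(corrected - targets, axis=1).mean()
```

In this sketch the correction is stationary in screen coordinates; in practice the hand-eye coordination pairs arrive incrementally during normal use, so the training set grows and the model is refitted as the interface is operated.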
Results
To validate our model and method, we conducted a user study of ultrasonography tasks on a gaze-contingent interface for ultrasound machines. Results suggest that our method rectifies the tracking accuracy deterioration in \(75\%\) of the cases in our user study where deterioration occurred. On another benchmark dataset, our method restores tracking accuracy to a level comparable to the initial calibration in more than \(80\%\) of the cases.
Conclusions
Our implicit re-calibration method is a practical and convenient fix for tracking accuracy deterioration in gaze-contingent user interfaces, and in particular for gaze-contingent ultrasound machines.
References
Atkins MS, Moise A, Rohling R (2006) An application of eyegaze tracking for designing radiologists’ workstations: Insights for comparative visual search tasks. ACM Trans Appl Percep 3(2):136–151
Borg LK, Harrison TK, Kou A, Mariano ER, Udani AD, Kim TE, Shum C, Howard SK (2018) Preliminary experience using eye-tracking technology to differentiate novice and expert image interpretation for ultrasound-guided regional anesthesia. J Ultrasound Med 37(2):329–336
Cai Y, Sharma H, Chatelain P, Noble JA (2018) Multi-task SonoEyeNet: detection of fetal standardized planes assisted by generated sonographer attention maps. In: International conference on medical image computing and computer-assisted intervention, pp 871–879. Springer
Carrigan AJ, Brennan PC, Pietrzyk M, Clarke J, Chekaluk E (2015) A ‘snapshot’ of the visual search behaviours of medical sonographers. Aust J Ultrasound Med 18(2):70–77
Chatelain P, Sharma H, Drukker L, Papageorghiou AT, Noble JA (2018) Evaluation of gaze tracking calibration for longitudinal biomedical imaging studies. IEEE Trans Cybern
Cherif ZR, Nait-Ali A, Motsch J, Krebs M (2002) An adaptive calibration of an infrared light device used for gaze tracking. In: Proceedings of the 19th IEEE instrumentation and measurement technology conference (IMTC/2002), vol 2, pp 1029–1033. IEEE
Droste R, Cai Y, Sharma H, Chatelain P, Drukker L, Papageorghiou AT, Noble JA (2019) Ultrasound image representation learning by modeling sonographer visual attention. In: International conference on information processing in medical imaging, pp 592–604. Springer
Gomez AR, Gellersen H (2018) Smooth-i: smart re-calibration using smooth pursuit eye movements. In: Proceedings of the 2018 ACM symposium on eye tracking research and applications, p 10. ACM
Hastie T, Tibshirani R (1990) Exploring the nature of covariate effects in the proportional hazards model. Biometrics 3:1005–1016
Hornof AJ, Halverson T (2002) Cleaning up systematic error in eye-tracking data by using required fixation locations. Behav Res Methods Instrum Comput 34(4):592–604
Jia S, Koh DH, Pomplun M (2018) Gaze tracking accuracy maintenance using traffic sign detection. In: Adjunct proceedings of the 10th international conference on automotive user interfaces and interactive vehicular applications, pp 87–91. ACM
Johansen SA, San Agustin J, Skovsgaard H, Hansen JP, Tall M (2011) Low cost vs. high-end eye tracking for usability testing. In: CHI’11 extended abstracts on human factors in computing systems, pp 1177–1182. ACM
Kosevoi-Tichie A, Berghea F, Vlad V, Abobului M, Trandafir M, Gudu T, Peltea A, Duna M, Groseanu L, Patrascu C, Ionescu R (2015) Does eye gaze tracking have the ability to assess how rheumatologists evaluate musculoskeletal ultrasound images?
Levine S, Pastor P, Krizhevsky A, Ibarz J, Quillen D (2018) Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection. Int J Robot Res 37(4–5):421–436
Li Z, Tong I, Metcalf L, Hennessey C, Salcudean SE (2018) Free head movement eye gaze contingent ultrasound interfaces for the da Vinci surgical system. IEEE Robot Autom Lett 3(3):2137–2143
Liebling DJ, Dumais ST (2014) Gaze and mouse coordination in everyday work. In: Proceedings of the 2014 ACM international joint conference on pervasive and ubiquitous computing: adjunct publication, pp 1141–1150. ACM
Papoutsaki A, Gokaslan A, Tompkin J, He Y, Huang J (2018) The eye of the typer: a benchmark and analysis of gaze behavior during typing. In: Proceedings of the 2018 ACM symposium on eye tracking research and applications, p 16. ACM
Sidenmark L, Lundström A (2019) Gaze behaviour on interacted objects during hand interaction in virtual reality for eye tracking calibration. In: Eleventh edition of the ACM symposium on eye tracking research and applications (ETRA 2019)
Tripathi S, Guenter B (2017) A statistical approach to continuous self-calibrating eye gaze tracking for head-mounted virtual reality systems. In: 2017 IEEE winter conference on applications of computer vision (WACV), pp 862–870. IEEE
Williams CK, Rasmussen CE (2006) Gaussian processes for machine learning, vol 2. MIT Press, Cambridge
Zhu H, Salcudean S, Rohling R (2019) The Neyman-Pearson detection of microsaccades with maximum likelihood estimation of parameters. J Vis 19(13):17
Zhu H, Salcudean SE, Rohling RN (2019) A novel gaze-supported multimodal human-computer interaction for ultrasound machines. Int J Comput Assist Radiol Surg 14(7):1107–1115
Acknowledgements
This work was supported by an NSERC Strategic Project Grant and by infrastructure purchased with grants from CFI, involving Canadian federal and British Columbia Provincial support. The authors thank all participants in our user study, as well as the authors of [17], who shared their experimental data publicly.
Funding
This work was supported by an NSERC Strategic Project Grant and by infrastructure purchased with grants from CFI, involving Canadian federal and British Columbia Provincial support.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Ethical approval
The user study performed in our research was approved by the Behavioural Research Ethics Board at the University of British Columbia.
Informed consent
Informed consent was obtained from all individual participants included in the study.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Zhu, H., Rohling, R.N. & Salcudean, S.E. Hand-eye coordination-based implicit re-calibration method for gaze tracking on ultrasound machines: a statistical approach. Int J CARS 15, 837–845 (2020). https://doi.org/10.1007/s11548-020-02143-w