Part of the book series: Human–Computer Interaction Series ((HCIS))

Abstract

Human eye gaze is an important non-verbal cue that can unobtrusively provide information about a user's intention and attention, enabling intelligent interactive systems. Eye gaze can also serve as input to a system, replacing the conventional mouse and keyboard, and can be indicative of the user's cognitive state. However, estimating and applying gaze in real-world applications poses significant challenges. In this chapter, we first review the development of gaze estimation methods in recent years, focusing on learning-based gaze estimation methods that benefit from the large-scale data and deep learning techniques that have recently become available. Second, we discuss the challenges of using gaze estimation in real-world applications and our efforts toward making these methods easily usable for the Human-Computer Interaction community. Finally, we provide two application examples demonstrating the use of eye gaze to enable attentive and adaptive interfaces.
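The learning-based gaze estimation the abstract refers to can be illustrated with a minimal sketch: a small convolutional network regresses a 2D gaze direction (pitch and yaw) directly from a normalized eye-region image. The architecture, input size, and layer widths below are illustrative assumptions for a self-contained example, not the specific models discussed in the chapter.

```python
# Minimal sketch of an appearance-based gaze estimator: a CNN maps a
# grayscale eye patch directly to a 2D gaze direction (pitch, yaw).
# Architecture and sizes are illustrative, not the chapter's models.
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Two conv blocks extract appearance features from the eye image.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # A small MLP regresses the gaze angles from the pooled features.
        # After two 2x poolings, a 36x60 input becomes 9x15 with 32 channels.
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 9 * 15, 64), nn.ReLU(),
            nn.Linear(64, 2),  # (pitch, yaw) in radians
        )

    def forward(self, x):
        return self.regressor(self.features(x))

# A batch of one normalized 36x60 grayscale eye patch (a common size
# used in appearance-based gaze estimation pipelines).
model = GazeNet()
eye = torch.randn(1, 1, 36, 60)
gaze = model(eye)
print(gaze.shape)  # torch.Size([1, 2])
```

In practice such a network would be trained with a regression loss (e.g. L1 or angular error) on large-scale labeled datasets, which is precisely what made the learning-based methods reviewed in this chapter feasible.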



Author information

Correspondence to Xucong Zhang.

Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Zhang, X., Park, S., Feit, A.M. (2021). Eye Gaze Estimation and Its Applications. In: Li, Y., Hilliges, O. (eds) Artificial Intelligence for Human Computer Interaction: A Modern Approach. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-030-82681-9_4

  • DOI: https://doi.org/10.1007/978-3-030-82681-9_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-82680-2

  • Online ISBN: 978-3-030-82681-9

  • eBook Packages: Computer Science (R0)
