Abstract
Eye tracking has a long history in medical and psychological research as a tool for recording and studying human visual behavior. Real-time gaze-based text entry can also be a powerful means of communication and control for people with physical disabilities. Following recent technological advances and the advent of affordable eye trackers, there is growing interest in pervasive attention-aware systems and interfaces that have the potential to revolutionize mainstream human-technology interaction. In this chapter, we provide an introduction to the state of the art in eye tracking technology and gaze estimation. We discuss the challenges involved in using a perceptual organ, the eye, as an input modality. Examples of real-life applications are reviewed, together with design solutions derived from research results. We also discuss how to match user requirements and the key features of different eye tracking systems to find the best system for each task and application.
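The central challenge the abstract alludes to, using a perceptual organ as an input modality, is that the eye looks at things constantly, so every glance would register as a command (the "Midas touch" problem). The standard workaround in gaze interfaces is dwell-time selection: a target activates only after the gaze rests on it for a set duration. The sketch below illustrates the idea; the class name, parameter values, and sample-based API are illustrative assumptions, not code from the chapter.

```python
import math

class DwellSelector:
    """Minimal dwell-time selection sketch: a point is 'selected' only
    after the gaze has stayed within radius_px of it for dwell_ms
    milliseconds, so mere glances do not trigger actions."""

    def __init__(self, dwell_ms=500, radius_px=40):
        self.dwell_ms = dwell_ms      # required fixation duration
        self.radius_px = radius_px    # tolerance for fixation jitter
        self._anchor = None           # centre of the current fixation
        self._elapsed = 0.0           # time accumulated inside the radius

    def update(self, x, y, dt_ms):
        """Feed one gaze sample; return the fixation anchor (x, y) when a
        selection fires, otherwise None."""
        if self._anchor is None:
            self._anchor, self._elapsed = (x, y), 0.0
            return None
        ax, ay = self._anchor
        if math.hypot(x - ax, y - ay) <= self.radius_px:
            self._elapsed += dt_ms
            if self._elapsed >= self.dwell_ms:
                self._elapsed = 0.0   # re-arm after a selection
                return self._anchor
        else:
            # Gaze left the target: restart the dwell at the new point.
            self._anchor, self._elapsed = (x, y), 0.0
        return None
```

Tuning the two parameters is the design trade-off discussed throughout this literature: a short dwell speeds up entry but risks unintended selections, while a long dwell is safer but fatiguing.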
© 2014 Springer-Verlag London
Cite this chapter
Majaranta, P., Bulling, A. (2014). Eye Tracking and Eye-Based Human–Computer Interaction. In: Fairclough, S., Gilleade, K. (eds) Advances in Physiological Computing. Human–Computer Interaction Series. Springer, London. https://doi.org/10.1007/978-1-4471-6392-3_3
DOI: https://doi.org/10.1007/978-1-4471-6392-3_3
Publisher Name: Springer, London
Print ISBN: 978-1-4471-6391-6
Online ISBN: 978-1-4471-6392-3