
Eye Tracking and Eye-Based Human–Computer Interaction

Advances in Physiological Computing

Part of the book series: Human–Computer Interaction Series (HCIS)


Abstract

Eye tracking has a long history in medical and psychological research as a tool for recording and studying human visual behavior. Real-time gaze-based text entry can also be a powerful means of communication and control for people with physical disabilities. Following recent technological advances and the advent of affordable eye trackers, there is growing interest in pervasive attention-aware systems and interfaces that have the potential to revolutionize mainstream human-technology interaction. In this chapter, we provide an introduction to the state of the art in eye tracking technology and gaze estimation. We discuss the challenges involved in using a perceptual organ, the eye, as an input modality. Examples of real-life applications are reviewed, together with design solutions derived from research results. We also discuss how to match user requirements and the key features of different eye tracking systems to find the best system for each task and application.





Author information

Correspondence to Päivi Majaranta.


Copyright information

© 2014 Springer-Verlag London

About this chapter

Cite this chapter

Majaranta, P., Bulling, A. (2014). Eye Tracking and Eye-Based Human–Computer Interaction. In: Fairclough, S., Gilleade, K. (eds) Advances in Physiological Computing. Human–Computer Interaction Series. Springer, London. https://doi.org/10.1007/978-1-4471-6392-3_3


  • DOI: https://doi.org/10.1007/978-1-4471-6392-3_3

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-6391-6

  • Online ISBN: 978-1-4471-6392-3

  • eBook Packages: Computer Science, Computer Science (R0)
