Kalman Filtering in the Design of Eye-Gaze-Guided Computer Interfaces

  • Oleg V. Komogortsev
  • Javed I. Khan
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4552)


In this paper, we design an Attention Focus Kalman Filter (AFKF): a framework that offers interaction capabilities by constructing an eye-movement language, provides real-time perceptual compression through Human Visual System (HVS) modeling, and improves the system's reliability. The AFKF achieves these goals by identifying basic eye-movement types in real time, predicting the user's perceptual attention focus, applying the eye's visual sensitivity function, and de-noising the eye-position data signal.
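The paper itself does not include source code; purely as an illustration of the de-noising step mentioned above, the sketch below applies a generic constant-velocity Kalman filter to a one-dimensional eye-position signal. The function name `kalman_denoise`, the 60 Hz sampling rate, and the noise variances are assumptions chosen for illustration, not the AFKF's actual model or parameters.

```python
import numpy as np

def kalman_denoise(positions, dt=1/60.0, process_var=10.0, meas_var=1.0):
    """De-noise a 1-D eye-position signal with a constant-velocity Kalman filter.

    positions   : measured gaze coordinates (e.g., degrees), one per sample
    dt          : sampling interval in seconds (assumed 60 Hz tracker)
    process_var : variance of the acceleration noise driving the motion model
    meas_var    : variance of the eye-tracker measurement noise
    (All defaults are illustrative assumptions, not values from the paper.)
    """
    # State x = [position, velocity]; constant-velocity transition model.
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])                         # only position is measured
    Q = process_var * np.array([[dt**4 / 4, dt**3 / 2],
                                [dt**3 / 2, dt**2]])   # process-noise covariance
    R = np.array([[meas_var]])                         # measurement-noise covariance

    x = np.array([[positions[0]], [0.0]])              # initial state estimate
    P = np.eye(2)                                      # initial state covariance

    filtered = []
    for z in positions:
        # Predict step: propagate state and covariance through the motion model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step: correct the prediction with the new measurement z.
        y = np.array([[z]]) - H @ x                    # innovation (residual)
        S = H @ P @ H.T + R                            # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        filtered.append(float(x[0, 0]))
    return np.array(filtered)

# Usage example: noisy fixation followed by a saccade-like jump.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_pos = np.concatenate([np.zeros(100), np.full(100, 10.0)])
    noisy = true_pos + rng.normal(0.0, 0.5, size=true_pos.size)
    smooth = kalman_denoise(noisy)
    print(smooth[:5], smooth[-5:])
```

In a setting like the one described in the abstract, the filter's innovation sequence (the residual `y` above) can also be monitored: persistently large innovations indicate that the constant-velocity assumption has broken down, which is one common way to flag rapid movements such as saccades and support real-time eye-movement classification.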


Keywords: Human Visual System Modeling, Kalman Filter, Human-Computer Interaction, Perceptual Compression





Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Oleg V. Komogortsev (1)
  • Javed I. Khan (1)
  1. Perceptual Engineering Laboratory, Department of Computer Science, Kent State University, Kent, OH 44242, USA
