
Machine Vision and Applications

Volume 20, Issue 6, pp 353–364

Real-time gaze tracking with appearance-based models

  • Javier Orozco
  • F. Xavier Roca
  • Jordi Gonzàlez
Original Paper

Abstract

Psychological evidence has emphasized the importance of eye gaze analysis in human–computer interaction and emotion interpretation. To this end, current image analysis algorithms take eyelid and iris motion detection into consideration using colour information and edge detectors. However, eye movement is fast, and precise, robust tracking is therefore difficult to obtain. Instead, our method describes eyelid and iris movements as continuous variables using appearance-based tracking. This approach combines the strengths of adaptive appearance models, optimization methods and backtracking techniques. In the proposed method, textures are learned on-line from near-frontal images, so that illumination changes, occlusions and fast movements can be handled. The method achieves real-time performance by combining two appearance-based trackers with a backtracking algorithm for eyelid estimation and another for iris estimation. These contributions represent a significant advance towards a reliable description of gaze motion for HCI and expression analysis, where the strengths of complementary methodologies are combined to avoid relying on high-quality images, colour information, texture training, camera settings and other time-consuming processes.
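As a rough illustration of the on-line texture learning the abstract describes, the sketch below maintains an appearance template updated by exponential forgetting and scores new observations with a Huber-style robust error (the abstract's reference to adaptive appearance models and robust estimation). This is not the authors' implementation; the class, parameter names, and the forgetting-factor scheme are illustrative assumptions.

```python
import numpy as np

def huber_weight(residual, k=1.345):
    """Huber-style weights: 1 inside the threshold k, k/|r| outside,
    so large residuals (occlusions, fast motion) are down-weighted."""
    r = np.abs(residual)
    w = np.ones_like(r)
    mask = r > k
    w[mask] = k / r[mask]
    return w

class OnlineAppearanceModel:
    """Mean-texture template adapted on-line with exponential forgetting.
    (Illustrative sketch; alpha and the update rule are assumptions.)"""

    def __init__(self, first_texture, alpha=0.1):
        self.mean = np.asarray(first_texture, dtype=float)
        self.alpha = alpha  # forgetting factor: larger = faster adaptation

    def error(self, texture):
        """Robust (Huber-weighted) squared distance to the template,
        used to compare candidate eyelid/iris configurations."""
        r = np.asarray(texture, dtype=float) - self.mean
        return float(np.sum(huber_weight(r) * r ** 2))

    def update(self, texture):
        """Blend the newest observation into the template so the model
        tracks gradual illumination and appearance changes."""
        self.mean = (1 - self.alpha) * self.mean + self.alpha * np.asarray(texture, dtype=float)
```

In a tracker of this kind, the `error` term would be minimized over the eyelid and iris state parameters at each frame, and `update` would be called only on well-fitting frames so outliers do not corrupt the template.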

Keywords

Eyelid and iris tracking · Appearance models · Blinking · Iris saccade · Real-time gaze tracking



Copyright information

© Springer-Verlag 2008

Authors and Affiliations

  • Javier Orozco (1)
  • F. Xavier Roca (1)
  • Jordi Gonzàlez (2)

  1. Computer Vision Center & Dept. de Ciències de la Computació, Bellaterra, Spain
  2. Institut de Robòtica i Informàtica Industrial (UPC–CSIC), Barcelona, Spain
