
An Incremental Learning Method for Unconstrained Gaze Estimation

  • Yusuke Sugano
  • Yasuyuki Matsushita
  • Yoichi Sato
  • Hideki Koike
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5304)

Abstract

This paper presents an online learning algorithm for appearance-based gaze estimation that allows free head movement in a casual desktop environment. Our method avoids a lengthy calibration stage by using an incremental learning approach. The system runs as a background process on the desktop PC and continuously updates the estimation parameters by taking the user's operations on the PC monitor as input. To handle free head movement, we propose a pose-based clustering approach that efficiently extends an appearance manifold model to cover large variations in head pose. The effectiveness of the proposed method is validated through a quantitative performance evaluation with three users.
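The abstract describes the learning loop only at a high level: each of the user's operations on the monitor (e.g., a mouse click) is treated as a labelled gaze sample, the sample is assigned to a head-pose cluster, and the appearance model for that cluster is updated incrementally. The Python sketch below is only an illustration of that idea, not the authors' implementation; the class name, the nearest-neighbour interpolation used in place of the paper's local appearance-manifold reconstruction, and all parameters are hypothetical simplifications.

```python
import numpy as np

class PoseClusterGazeModel:
    """Illustrative sketch: per-head-pose clusters, each holding
    (eye-appearance feature, screen coordinate) samples that are added
    incrementally whenever the user clicks on the monitor."""

    def __init__(self, pose_centers, k=4):
        self.pose_centers = np.asarray(pose_centers, dtype=float)  # cluster centers in head-pose space
        self.k = k                                                  # neighbours used for interpolation
        self.samples = {i: [] for i in range(len(self.pose_centers))}  # appearance vectors per cluster
        self.targets = {i: [] for i in range(len(self.pose_centers))}  # clicked screen positions per cluster

    def _cluster(self, head_pose):
        # Assign the sample to the nearest head-pose cluster.
        d = np.linalg.norm(self.pose_centers - np.asarray(head_pose, dtype=float), axis=1)
        return int(np.argmin(d))

    def update(self, eye_feature, head_pose, click_xy):
        # Incremental step: a mouse click supplies the gaze label.
        c = self._cluster(head_pose)
        self.samples[c].append(np.asarray(eye_feature, dtype=float))
        self.targets[c].append(np.asarray(click_xy, dtype=float))

    def estimate(self, eye_feature, head_pose):
        # Interpolate gaze from the k nearest appearance samples in the
        # matching pose cluster (a stand-in for interpolation on the
        # appearance manifold).
        c = self._cluster(head_pose)
        X = np.asarray(self.samples[c])
        Y = np.asarray(self.targets[c])
        if len(X) == 0:
            return None
        d = np.linalg.norm(X - np.asarray(eye_feature, dtype=float), axis=1)
        idx = np.argsort(d)[: self.k]
        w = 1.0 / (d[idx] + 1e-6)
        return (w[:, None] * Y[idx]).sum(axis=0) / w.sum()
```

In an actual background process of this kind, `update()` would be called on every click event and `estimate()` on every captured frame, so the model keeps improving during normal PC use without an explicit calibration session.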

Keywords

Incremental Learning, Online Learning Algorithm, Facial Mesh, Desktop Environment, Multiple Light Source

Supplementary material

978-3-540-88690-7_49_MOESM1_ESM.mpg (26.7 MB)

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Yusuke Sugano (1)
  • Yasuyuki Matsushita (2)
  • Yoichi Sato (1)
  • Hideki Koike (3)
  1. The University of Tokyo, Tokyo, Japan
  2. Microsoft Research Asia, Beijing, China
  3. The University of Electro-Communications, Tokyo, Japan
