Robust Gaze Estimation via Normalized Iris Center-Eye Corner Vector

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9834)

Abstract

Gaze estimation plays an important role in many practical scenarios such as human-robot interaction. Although highly accurate gaze estimation can be achieved in constrained settings with additional IR sources or depth sensors, gaze estimation with a single web-cam remains challenging. This paper proposes a normalized iris center-eye corner (NIC-EC) vector based gaze estimation method using a single, low-cost web-cam. First, reliable facial features and pupil centers are extracted. Then, the NIC-EC vector is constructed to enhance the robustness and accuracy of pupil center-eye corner vector based gaze estimation. Finally, an interpolation method is employed to map the constructed vectors to points of regard. Experimental results show that the proposed method significantly improves accuracy over the pupil center-eye corner vector based method, achieving an average accuracy of \(1.66^\circ \) under slight head movements.
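The abstract outlines a three-stage pipeline: feature extraction, NIC-EC vector construction, and interpolation-based mapping to points of regard. The sketch below illustrates the latter two stages under stated assumptions: the NIC-EC vector is taken as the iris-center-to-eye-corner offset normalized by the inter-corner distance, and the vector-to-gaze mapping is a second-order polynomial fitted to calibration data. The function names and the exact normalization are illustrative and are not taken from the paper.

```python
import numpy as np


def nic_ec_vector(iris_center, inner_corner, outer_corner):
    """Build a normalized iris center-eye corner vector for one eye.

    Assumption: the iris-center-to-inner-corner offset is divided by the
    distance between the two eye corners so the feature is invariant to
    scale changes from slight head movements; the paper's exact
    normalization may differ.
    """
    iris_center = np.asarray(iris_center, dtype=float)
    inner_corner = np.asarray(inner_corner, dtype=float)
    outer_corner = np.asarray(outer_corner, dtype=float)
    scale = np.linalg.norm(outer_corner - inner_corner)
    return (iris_center - inner_corner) / scale


def fit_gaze_mapping(features, gaze_points):
    """Least-squares fit of a second-order polynomial mapping (a common
    interpolation choice, assumed here) from 2-D NIC-EC features (N, 2)
    to calibration points of regard (N, 2)."""
    x, y = features[:, 0], features[:, 1]
    basis = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
    coeffs, *_ = np.linalg.lstsq(basis, gaze_points, rcond=None)
    return coeffs  # shape (6, 2): one column per screen coordinate


def predict_gaze(coeffs, feature):
    """Map a single NIC-EC feature to an estimated point of regard."""
    x, y = feature
    basis = np.array([1.0, x, y, x * y, x ** 2, y ** 2])
    return basis @ coeffs
```

In a typical calibration pass, NIC-EC features would be collected while the user fixates known screen targets, the mapping fitted once, and predict_gaze then evaluated per frame.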

Keywords

Gaze estimation · Eye tracking · Normalized iris center-eye corner vector · Interpolation


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Haibin Cai (1)
  • Hui Yu (1)
  • Xiaolong Zhou (2)
  • Honghai Liu (1)

  1. School of Computing, University of Portsmouth, Portsmouth, UK
  2. Shenzhen Research Institute, City University of Hong Kong, Hong Kong, China
