
Gaze direction estimation by component separation for recognition of Eye Accessing Cues

  • Original Paper
  • Published in: Machine Vision and Applications

Abstract

This paper investigates the recognition of the Eye Accessing Cues used in Neuro-Linguistic Programming as a means of inferring a person’s thinking mechanisms, since the meaning of non-visual gaze directions may be directly related to internal mental processes. The direction of gaze is identified by separating the components of the eye (i.e., iris, sclera and surrounding skin) and then retrieving the relative position of the iris within the eye bounding box, which was previously extracted by an eye landmark localizer. The eye cues are recovered by a logistic classifier from features that describe the typical regions within the eye bounding box. Investigating both eyes simultaneously, as well as tracking the eyes over consecutive frames, is shown to increase the overall performance. The proposed solution is tested on four databases and achieves superior recognition rates compared with methods relying on state-of-the-art algorithms.
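To make the classification stage concrete, the following is a minimal sketch (not the authors' implementation) of how an Eye Accessing Cue could be predicted with a logistic classifier from the relative position of the iris inside each eye's bounding box. The feature layout, class labels and use of scikit-learn are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' code): classify an Eye Accessing Cue
# from the relative iris position inside each eye bounding box with a logistic
# classifier, as the abstract describes. Labels and features are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical EAC labels corresponding to gaze directions in the NLP model.
EAC_CLASSES = ["up-left", "up-right", "left", "right",
               "down-left", "down-right", "centre"]

def relative_iris_features(iris_center, eye_bbox):
    """Normalised (x, y) position of the iris centre within the eye bounding box.

    iris_center : (cx, cy) in image coordinates
    eye_bbox    : (x, y, w, h) as returned by an eye-landmark localizer
    """
    cx, cy = iris_center
    x, y, w, h = eye_bbox
    return np.array([(cx - x) / w, (cy - y) / h])

def eac_features(left_iris, left_bbox, right_iris, right_bbox):
    """Concatenate features from both eyes; the paper reports that using the
    two eyes jointly improves recognition."""
    return np.hstack([relative_iris_features(left_iris, left_bbox),
                      relative_iris_features(right_iris, right_bbox)])

# Placeholder training data: one row of 4 features per frame, one EAC label each.
X_train = np.random.rand(70, 4)                   # stand-in for real features
y_train = np.random.choice(EAC_CLASSES, size=70)  # stand-in for annotations

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

# Classify one frame: iris centres and eye boxes would come from the component
# separation and landmark localisation stages described above.
sample = eac_features((112, 85), (100, 78, 30, 14), (162, 86), (150, 79, 30, 14))
print(clf.predict(sample.reshape(1, -1)))
```

In practice the features would come from the component-separation step rather than raw coordinates, and temporal smoothing over consecutive frames could be added on top of the per-frame predictions, in line with the tracking result reported in the abstract.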



Acknowledgments

This work has been co-funded by the Sectoral Operational Program Human Resources Development (SOP HRD) 2007–2013, financed from the European Social Fund and by the Romanian Government, under contract numbers POSDRU/107/1.5/S/76903, POSDRU/89/1.5/S/62557 and POSDRU/159/1.5/S/134398.

Author information

Corresponding author

Correspondence to Corneliu Florea.


Cite this article

Vrânceanu, R., Florea, C., Florea, L. et al.: Gaze direction estimation by component separation for recognition of Eye Accessing Cues. Machine Vision and Applications 26, 267–278 (2015). https://doi.org/10.1007/s00138-014-0656-8

