Abstract
Data plays a central role in pattern recognition tasks, including eye gaze estimation. Most prior work relies on general-purpose face datasets that were not designed for gaze estimation, which makes finely labeled gaze directions difficult to obtain. Large datasets with well-defined gaze directions are therefore desirable.
To facilitate related research, we collect and release the Oulu Multi-pose Eye Gaze (OMEG) dataset. Motivated by the psychological observation that gaze direction is intrinsically linked with head orientation, we build a new dataset of eye gaze images captured under multiple head poses. The resulting dataset contains over 40,000 images from 50 subjects, each asked to fixate on 10 predefined points on a screen under different head poses. We further investigate a new eye gaze estimation approach based on the image gradient orientation (IGO) description and compare it with other popular gaze estimation approaches to provide baseline results on our dataset.
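The IGO description mentioned above maps each pixel's gradient orientation onto the unit circle, discarding gradient magnitude so the representation is robust to illumination changes. The paper does not give implementation details, so the following is only a minimal sketch of that idea (function name and the toy ramp image are illustrative, not from the paper):

```python
import numpy as np

def igo_features(image):
    """Sketch of an Image Gradient Orientation (IGO) descriptor.

    Each pixel's gradient orientation phi is embedded on the unit
    circle as [cos(phi), sin(phi)]; magnitude is discarded, which
    gives robustness to illumination variation.
    """
    img = np.asarray(image, dtype=np.float64)
    gy, gx = np.gradient(img)        # central-difference gradients (rows, cols)
    phi = np.arctan2(gy, gx)         # per-pixel gradient orientation
    # Stack the cosine and sine maps into one feature vector.
    return np.concatenate([np.cos(phi).ravel(), np.sin(phi).ravel()])

# Toy example: an 8x8 ramp that brightens from top to bottom,
# so every pixel's orientation is pi/2.
ramp = np.outer(np.arange(8.0), np.ones(8))
f = igo_features(ramp)
print(f.shape)  # (128,)
```

In a full gaze pipeline such a feature vector would feed a subspace or regression model; the vector length is twice the pixel count (one cosine and one sine value per pixel).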
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
He, Q. et al. (2015). OMEG: Oulu Multi-Pose Eye Gaze Dataset. In: Paulsen, R., Pedersen, K. (eds) Image Analysis. SCIA 2015. Lecture Notes in Computer Science, vol 9127. Springer, Cham. https://doi.org/10.1007/978-3-319-19665-7_35
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-19664-0
Online ISBN: 978-3-319-19665-7
eBook Packages: Computer Science, Computer Science (R0)