Real-Time Gaze Tracking for Public Displays

  • Andreas Sippl
  • Clemens Holzmann
  • Doris Zachhuber
  • Alois Ferscha
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6439)

Abstract

In this paper, we explore the real-time tracking of human gazes in front of large public displays. The aim of our work is to estimate at which area of a display one or more people are looking at any given time, independently of their distance and angle to the display as well as their height. Gaze tracking is relevant for a variety of purposes, including the automatic recognition of the user’s focus of attention or the control of interactive applications with gaze gestures. The focus of the present paper is on the former, and we show how gaze tracking can be used for implicit interaction in the pervasive advertising domain. We have developed a prototype for this purpose, which (i) uses an overhead-mounted camera to distinguish four gaze areas on a large display, (ii) works for a wide range of positions in front of the display, and (iii) provides an estimation of the currently gazed quarters in real time. We present a detailed description of the prototype as well as the results of a user study with 12 participants, which show the recognition accuracy for different positions in front of the display.
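To illustrate the general idea of quantizing a continuous gaze estimate into a small number of display areas, the following Python sketch maps an estimated head position and gaze direction to one of four display quarters. It is a minimal illustration only, not the authors' implementation: the geometry, the coordinate conventions, and all names (GazeEstimate, gazed_quarter) are assumptions made for this example.

```python
# Illustrative sketch: mapping a gaze estimate to one of four display quarters.
# All names and the simple pinhole-style geometry are hypothetical, not from the paper.

from dataclasses import dataclass
import math

@dataclass
class GazeEstimate:
    """Viewer position relative to the display centre (metres) and gaze angles (degrees)."""
    x: float         # lateral offset from the display centre (+ = right)
    distance: float  # distance to the display plane
    height: float    # eye height relative to the display centre line (+ = above)
    yaw: float       # horizontal gaze angle, 0 = straight ahead, + = right
    pitch: float     # vertical gaze angle, 0 = straight ahead, + = up

def gazed_quarter(g: GazeEstimate) -> str:
    """Intersect the gaze ray with the display plane and return the gazed quarter."""
    # Point where the gaze ray hits the display plane.
    hit_x = g.x + g.distance * math.tan(math.radians(g.yaw))
    hit_y = g.height + g.distance * math.tan(math.radians(g.pitch))
    horizontal = "right" if hit_x >= 0 else "left"
    vertical = "top" if hit_y >= 0 else "bottom"
    return f"{vertical}-{horizontal}"

# Example: a person 2 m away, 0.5 m left of centre, looking slightly up and to the right.
print(gazed_quarter(GazeEstimate(x=-0.5, distance=2.0, height=-0.1, yaw=20.0, pitch=5.0)))
# -> "top-right"
```

In a real system, the position and angle inputs would come from a head-pose or gaze estimator driven by the overhead camera, and the quantized quarter would be reported per tracked person in each frame.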

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Andreas Sippl (1)
  • Clemens Holzmann (2)
  • Doris Zachhuber (1)
  • Alois Ferscha (1)
  1. Institute for Pervasive Computing, Johannes Kepler University Linz, Austria
  2. Mobile Computing, Upper Austria University of Applied Sciences, Austria