Accuracy of Monocular Gaze Tracking on 3D Geometry

Part of the Mathematics and Visualization book series (MATHVISUAL)

Abstract

Many applications, such as data visualization or object recognition, benefit from accurate knowledge of where a person is looking. We present a system for accurately tracking gaze positions on a three-dimensional object using a monocular head-mounted eye tracker. We accomplish this by (1) using digital manufacturing to create stimuli whose geometry is known to high accuracy, (2) embedding fiducial markers into the manufactured objects to reliably estimate the rigid transformation of the object, and (3) using a perspective model to relate pupil positions to 3D locations. This combination enables the efficient and accurate computation of gaze positions on an object from measured pupil positions. We validate the accuracy of our system experimentally, achieving an angular resolution of 0.8° and a 1.5% depth error using a simple calibration procedure with 11 points.
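The pipeline sketched in the abstract ends in a purely geometric step: a calibrated gaze ray, expressed in the object's coordinate frame via the marker-based pose, is intersected with the known mesh to obtain the 3D gaze position. As a minimal illustrative sketch of that step (not the authors' implementation; all names here are hypothetical), a standard Möller–Trumbore ray-triangle test over a triangle soup could look like this:

```python
import numpy as np

def intersect_ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moeller-Trumbore ray/triangle intersection.

    Returns the distance t along the ray direction to the hit point,
    or None if the ray misses the triangle.
    """
    e1, e2 = v1 - v0, v2 - v0
    h = np.cross(direction, e2)
    a = np.dot(e1, h)
    if abs(a) < eps:            # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = origin - v0
    u = f * np.dot(s, h)
    if u < 0.0 or u > 1.0:      # outside first barycentric bound
        return None
    q = np.cross(s, e1)
    v = f * np.dot(direction, q)
    if v < 0.0 or u + v > 1.0:  # outside second barycentric bound
        return None
    t = f * np.dot(e2, q)
    return t if t > eps else None  # hit must lie in front of the origin

def gaze_point_on_mesh(origin, direction, triangles):
    """Nearest intersection of a gaze ray with a list of triangles."""
    hits = [t for tri in triangles
            if (t := intersect_ray_triangle(origin, direction, *tri)) is not None]
    return origin + min(hits) * direction if hits else None

# Toy example: one triangle in the plane z = 1, gaze ray along +z.
tri = (np.array([0.0, 0.0, 1.0]),
       np.array([1.0, 0.0, 1.0]),
       np.array([0.0, 1.0, 1.0]))
p = gaze_point_on_mesh(np.array([0.25, 0.25, 0.0]),
                       np.array([0.0, 0.0, 1.0]), [tri])
# p is [0.25, 0.25, 1.0]
```

In a full system the ray origin and direction would come from the eye-tracker calibration, and the triangles from the fabricated object's digital model transformed by the fiducial-marker pose; a spatial acceleration structure would replace the linear scan for realistic meshes.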

Keywords

  • Fiducial Marker
  • Angular Error
  • Rigid Transformation
  • Chin Rest
  • Depth Error

These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Acknowledgements

This work has been partially supported by the ERC through grant ERC-2010-StG 259550 (XSHAPE). We thank Felix Haase for his valuable support in performing the experiments and Marianne Maertens for discussions on the experimental setup.

Author information

Corresponding author

Correspondence to Xi Wang.


Copyright information

© 2017 Springer International Publishing AG

Cite this paper

Wang, X., Lindlbauer, D., Lessig, C., Alexa, M. (2017). Accuracy of Monocular Gaze Tracking on 3D Geometry. In: Burch, M., Chuang, L., Fisher, B., Schmidt, A., Weiskopf, D. (eds) Eye Tracking and Visualization. ETVIS 2015. Mathematics and Visualization. Springer, Cham. https://doi.org/10.1007/978-3-319-47024-5_10
