Estimating the Visual Direction with Two-Circle Algorithm

  • Haiyuan Wu
  • Qian Chen
  • Toshikazu Wada
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3338)

Abstract

This paper describes a novel method for estimating the visual direction from a single monocular image using a “two-circle” algorithm. We assume that the visual directions of both eyes are parallel and that the iris boundaries are circles in 3-D space. Our “two-circle” algorithm estimates the normal vector of the plane supporting the two iris boundaries, from which the visual direction can be calculated. Most existing gaze estimation algorithms require the eye corners and some heuristic knowledge about the structure of the eye, in addition to the iris contours. In contrast to existing methods, ours does not use such additional information. Another advantage of our algorithm is that it does not require the focal length; it is therefore capable of estimating the visual direction from an image taken by an active camera. Extensive experiments on both simulated and real images demonstrate the robustness and effectiveness of our method.
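A prerequisite of any two-circle method is representing each projected iris boundary as a conic (an ellipse) in the image. The sketch below is not the authors' implementation; it is a minimal, generic least-squares conic fit, assuming iris contour points have already been extracted, that recovers the 3×3 symmetric conic matrix C satisfying xᵀCx = 0 for homogeneous image points x.

```python
import numpy as np

def fit_conic(points):
    """Least-squares conic fit: find the 3x3 symmetric matrix C that
    minimizes the algebraic residual x^T C x over the contour points."""
    x, y = points[:, 0], points[:, 1]
    # Design matrix for a*x^2 + b*xy + c*y^2 + d*x + e*y + f = 0
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    # The coefficient vector is the right singular vector associated
    # with the smallest singular value of D (unit-norm solution).
    _, _, Vt = np.linalg.svd(D)
    a, b, c, d, e, f = Vt[-1]
    return np.array([[a,     b / 2, d / 2],
                     [b / 2, c,     e / 2],
                     [d / 2, e / 2, f    ]])

# Illustration with synthetic data: points on an ellipse centered at
# (3, 2) with semi-axes 2 and 1 (stand-ins for an iris contour).
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
pts = np.column_stack([3 + 2 * np.cos(t), 2 + np.sin(t)])
C = fit_conic(pts)

# Check the algebraic residual x^T C x on the homogeneous points.
h = np.column_stack([pts, np.ones(len(pts))])
residual = np.max(np.abs(np.einsum('ij,jk,ik->i', h, C, h)))
```

From two such conic matrices (one per iris) the paper's algorithm recovers the common supporting plane's normal; that step depends on the two-coplanar-circles calibration machinery of reference [1] and is not reproduced here.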



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Haiyuan Wu (1)
  • Qian Chen (1)
  • Toshikazu Wada (1)
  1. Faculty of Systems Engineering, Wakayama University, Japan
