International Journal of Computer Vision, Volume 70, Issue 1, pp 23–40

Corneal Imaging System: Environment from Eyes

  • Ko Nishino
  • Shree K. Nayar
Abstract

This paper provides a comprehensive analysis of exactly what visual information about the world is embedded within a single image of an eye. It turns out that the cornea of an eye and a camera viewing the eye form a catadioptric imaging system. We refer to this as a corneal imaging system. Unlike a typical catadioptric system, a corneal one is flexible in that the reflector (cornea) is not rigidly attached to the camera. Using a geometric model of the cornea based on anatomical studies, its 3D location and orientation can be estimated from a single image of the eye. Once this is done, a wide-angle view of the environment of the person can be obtained from the image. In addition, we can compute the projection of the environment onto the retina with its center aligned with the gaze direction. This foveated retinal image reveals what the person is looking at. We present a detailed analysis of the characteristics of the corneal imaging system including field of view, resolution and locus of viewpoints. When both eyes of a person are captured in an image, we have a stereo corneal imaging system. We analyze the epipolar geometry of this stereo system and show how it can be used to compute 3D structure. The framework we present in this paper for interpreting eye images is passive and non-invasive. It has direct implications for several fields including visual recognition, human-machine interfaces, computer graphics and human affect studies.
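The core geometric idea above, that the cornea and the camera observing it form a catadioptric system, reduces at each pixel to a ray-reflection computation: intersect the camera's viewing ray with the corneal surface and mirror it about the surface normal to recover the direction from which environment light arrived. The sketch below illustrates this with a spherical corneal approximation of radius 7.8 mm (a commonly cited anatomical value); the sphere model, the constant, and the function name are illustrative assumptions, not the paper's actual procedure, which estimates an anatomically based corneal model's 3D pose from the eye image.

```python
# Illustrative sketch (not the paper's method): treat the cornea as a convex
# spherical mirror and trace one camera ray to an environment direction.
import numpy as np

CORNEA_RADIUS = 7.8e-3  # meters; typical anatomical corneal radius (assumption)

def reflect_off_cornea(ray_origin, ray_dir, cornea_center, radius=CORNEA_RADIUS):
    """Intersect a camera ray with a spherical cornea and mirror-reflect it.

    Returns (surface_point, reflected_dir), or None if the ray misses the
    sphere. The reflected direction points toward the environment feature
    imaged at this pixel."""
    o = np.asarray(ray_origin, dtype=float)
    d = np.asarray(ray_dir, dtype=float)
    d = d / np.linalg.norm(d)
    c = np.asarray(cornea_center, dtype=float)

    # Solve |o + t d - c|^2 = radius^2, a quadratic in t.
    oc = o - c
    b = 2.0 * np.dot(d, oc)
    cc = np.dot(oc, oc) - radius ** 2
    disc = b * b - 4.0 * cc
    if disc < 0.0:
        return None  # ray misses the corneal sphere
    t = (-b - np.sqrt(disc)) / 2.0  # nearest root: the front (visible) surface
    if t <= 0.0:
        return None  # intersection behind the camera

    p = o + t * d                    # point on the corneal surface
    n = (p - c) / radius             # outward unit surface normal
    r = d - 2.0 * np.dot(d, n) * n   # specular (mirror) reflection
    return p, r
```

For example, a ray aimed straight at the sphere center strikes the surface head-on and reflects straight back toward the camera; off-axis rays fan out, which is why a single eye image captures a wide-angle view of the surroundings. Repeating this per pixel, with the corneal pose estimated rather than assumed, is what yields the environment map described above.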

Keywords

eye, cornea, catadioptric imaging system, stereo, panorama, retinal projection



Copyright information

© Springer Science + Business Media, LLC 2006

Authors and Affiliations

  1. Department of Computer Science, Drexel University, Philadelphia
  2. Department of Computer Science, Columbia University, New York
