Machine Vision and Applications

Volume 20, Issue 5, pp 319–337

A robust eye gaze tracking method based on a virtual eyeball model

Original Paper

Abstract

Gaze positions can provide important cues for natural computer interfaces. In this paper, we describe a new gaze estimation method based on a three-dimensional analysis of the human eye, which can be used in head-mounted display (HMD) environments. The proposed method has four advantages over previous work. First, to obtain accurate gaze positions, we used a virtual eyeball model based on the 3D characteristics of the human eyeball. Second, we calculated the 3D position of the virtual eyeball and the gaze vector by using a camera and three collimated IR-LEDs. Third, three reference frames (the camera, the monitor, and the eye reference frames) were unified, which simplified the complex 3D coordinate conversions and allowed the 3D eye position and the gaze position on an HMD monitor to be calculated. Fourth, a simple user-dependent calibration method was proposed that requires gazing at only one position, based on Kappa compensation. Experimental results showed that the eye gaze estimation error of the proposed method was lower than 1°.
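The reference-frame unification described above can be illustrated with a minimal sketch: a gaze ray estimated in the camera reference frame is mapped into the monitor (screen) reference frame by a homogeneous transform and intersected with the screen plane. The function names, the z = 0 screen-plane convention, and the numeric values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and a translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def gaze_point_on_screen(eye_center_cam, gaze_dir_cam, T_cam_to_screen):
    """Map a gaze ray from the camera frame into the screen frame and
    intersect it with the (assumed) screen plane z = 0.

    eye_center_cam : 3D eye position in the camera frame
    gaze_dir_cam   : 3D gaze direction in the camera frame
    T_cam_to_screen: 4x4 homogeneous transform, camera -> screen frame
    Returns the 2D (x, y) intersection on the screen plane.
    """
    # Transform the eye center (a point) and the gaze direction (a vector).
    c = T_cam_to_screen @ np.append(eye_center_cam, 1.0)
    d = T_cam_to_screen[:3, :3] @ gaze_dir_cam
    # Ray p(s) = c[:3] + s * d; solve for the plane z = 0.
    s = -c[2] / d[2]
    return (c[:3] + s * d)[:2]

# Example: camera and screen frames coincide, eye 50 mm in front of the
# screen, gazing straight ahead along +z.
point = gaze_point_on_screen(np.array([0.0, 0.0, -50.0]),
                             np.array([0.0, 0.0, 1.0]),
                             np.eye(4))
```

In the paper's setting the camera-to-monitor transform would come from the HMD geometry and the one-point Kappa calibration; here it is simply the identity to keep the example self-contained.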

Keywords

Bright spot, head-mounted display, rotational radius, pupil center, virtual screen



Copyright information

© Springer-Verlag 2008

Authors and Affiliations

  1. Department of Computer Science, Sangmyung University, Seoul, Republic of Korea
  2. Department of Electronics Engineering, Dongguk University, Seoul, Republic of Korea
