Journal of Real-Time Image Processing

Volume 8, Issue 1, pp 21–33

Comparative study on photometric normalization algorithms for an innovative, robust and real-time eye gaze tracker

  • Antonino Armato
  • Antonio Lanatà
  • Enzo Pasquale Scilingo
Special Issue

Abstract

Eye gaze trackers (EGTs) are generally developed for scientific exploration in controlled environments or laboratories, and their data have been used in ophthalmology, neurology, psychology, and related areas to study oculomotor characteristics and abnormalities and their relation to cognition and mental states. Illumination is one of the most restrictive limitations of EGTs, because pupil center estimation becomes unreliable when lighting changes. Most current systems, indeed, work under controlled illumination conditions, either in dark or indoor environments, e.g. using infrared sources or constraining the light sources to fixed levels or pointing directions. This work explores and compares several photometric normalization techniques to improve EGT systems under changing light. In particular, a new wearable and wireless eye tracking system (HATCAM) is used to test the different techniques in terms of real-time capability, eye tracking, and pupil area detection. Embedding real-time image enhancement into the HATCAM can make it an innovative and robust system for eye tracking in different lighting conditions, i.e. darkness, sunlight, indoor and outdoor environments.
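As a concrete illustration of the kind of photometric normalization the study compares, the sketch below shows a single-scale, center/surround retinex in Python. The function name, the choice of a Gaussian surround with sigma = 80, and the NumPy/SciPy-based implementation are illustrative assumptions only and do not reproduce the authors' actual processing chain.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def single_scale_retinex(gray_image, sigma=80.0):
        """Center/surround retinex: subtract a log-domain illumination
        estimate (Gaussian-blurred image) from the log of the input.
        Illustrative sketch, not the HATCAM implementation."""
        img = gray_image.astype(np.float64) + 1.0      # avoid log(0)
        surround = gaussian_filter(img, sigma=sigma)   # smooth illumination estimate
        reflectance = np.log(img) - np.log(surround)   # illumination-normalized output
        # Rescale to 8-bit range so it can feed a standard pupil detector
        reflectance -= reflectance.min()
        reflectance /= reflectance.max() + 1e-12
        return (255.0 * reflectance).astype(np.uint8)

In the retinex framework, the blurred image approximates the slowly varying illumination component, so the log-difference retains mostly reflectance, which is far less sensitive to lighting changes than raw intensity.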

Copyright information

© Springer-Verlag 2011

Authors and Affiliations

  • Antonino Armato (1)
  • Antonio Lanatà (2)
  • Enzo Pasquale Scilingo (2)
  1. Department of Information Engineering, University of Pisa, Pisa, Italy
  2. Interdepartmental Research Center “E. Piaggio” and Department of Information Engineering, University of Pisa, Pisa, Italy