Machine Vision and Applications

Volume 27, Issue 8, pp 1275–1288

Pupil detection for head-mounted eye tracking in the wild: an evaluation of the state of the art

  • Wolfgang Fuhl
  • Marc Tonsen
  • Andreas Bulling
  • Enkelejda Kasneci
Special Issue Paper

Abstract

Robust and accurate detection of the pupil position is a key building block for head-mounted eye tracking and a prerequisite for applications built on top of it, such as gaze-based human–computer interaction or attention analysis. Despite a large body of work, detecting the pupil in images recorded under real-world conditions is challenging given significant variability in eye appearance (e.g., illumination, reflections, occlusions), individual differences in eye physiology, as well as other sources of noise, such as contact lenses or make-up. In this paper we review six state-of-the-art pupil detection methods, namely ElSe (Fuhl et al. in Proceedings of the ninth biennial ACM symposium on eye tracking research & applications, ACM, New York, NY, USA, pp 123–130, 2016), ExCuSe (Fuhl et al. in Computer analysis of images and patterns. Springer, New York, pp 39–51, 2015), Pupil Labs (Kassner et al. in Adjunct proceedings of the 2014 ACM international joint conference on pervasive and ubiquitous computing (UbiComp), pp 1151–1160, 2014. doi:10.1145/2638728.2641695), SET (Javadi et al. in Front Neuroeng 8, 2015), Starburst (Li et al. in Computer vision and pattern recognition-workshops, 2005. IEEE Computer society conference on CVPR workshops. IEEE, pp 79–79, 2005), and Świrski (Świrski et al. in Proceedings of the symposium on eye tracking research and applications (ETRA). ACM, pp 173–176, 2012. doi:10.1145/2168556.2168585). We compare their performance on a large-scale data set consisting of 225,569 annotated eye images taken from four publicly available data sets. Our experimental results show that the algorithm ElSe (Fuhl et al. 2016) outperforms the other pupil detection methods by a large margin, thus offering robust and accurate pupil positions on challenging everyday eye images.
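The abstract does not spell out the evaluation protocol, but comparisons of pupil detection methods on annotated eye images are commonly scored by the detection rate: the fraction of frames whose predicted pupil center lies within a fixed pixel distance of the ground-truth annotation. A minimal sketch of that metric, assuming per-frame (x, y) center annotations and an illustrative 5-pixel error threshold:

```python
import numpy as np

def detection_rate(pred, gt, max_error_px=5.0):
    """Fraction of frames whose predicted pupil center lies within
    max_error_px pixels (Euclidean distance) of the ground truth."""
    pred = np.asarray(pred, dtype=float)
    gt = np.asarray(gt, dtype=float)
    errors = np.linalg.norm(pred - gt, axis=1)  # per-frame pixel error
    return float(np.mean(errors <= max_error_px))

# Toy example: three frames, the second prediction is off by ~11 px.
pred = [(100, 80), (52, 40), (200, 210)]
gt   = [(101, 81), (60, 48), (200, 208)]
print(detection_rate(pred, gt))  # 2 of 3 frames within 5 px -> 0.666...
```

Sweeping `max_error_px` over a range of thresholds yields the detection-rate curves typically used to compare detectors; the threshold value and the exact protocol here are illustrative assumptions, not taken from the paper.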

Keywords

Pupil detection · Head-mounted eye tracking · Data set · Computer vision · Image processing

References

  1. Braunagel, C., Kasneci, E., Stolzmann, W., Rosenstiel, W.: Driver-activity recognition in the context of conditionally autonomous driving. In: 2015 IEEE 18th International Conference on Intelligent Transportation Systems, pp. 1652–1657 (2015). doi:10.1109/ITSC.2015.268
  2. Bulling, A., Ward, J.A., Gellersen, H., Tröster, G.: Eye movement analysis for activity recognition using electrooculography. IEEE Trans. Pattern Anal. Mach. Intell. 33(4), 741–753 (2011). doi:10.1109/TPAMI.2010.86
  3. Bulling, A., Weichel, C., Gellersen, H.: EyeContext: recognition of high-level contextual cues from human visual behaviour. In: Proceedings of the 31st SIGCHI International Conference on Human Factors in Computing Systems (CHI), pp. 305–308 (2013). doi:10.1145/2470654.2470697
  4. Douglas, D.H., Peucker, T.K.: Algorithms for the reduction of the number of points required to represent a digitized line or its caricature. Cartogr. Int. J. Geogr. Inf. Geovisualization 10(2), 112–122 (1973)
  5. Fuhl, W., Kübler, T., Sippel, K., Rosenstiel, W., Kasneci, E.: ExCuSe: robust pupil detection in real-world scenarios. In: Azzopardi, G., Petkov, N. (eds.) Computer Analysis of Images and Patterns, pp. 39–51. Springer, New York (2015)
  6. Fuhl, W., Santini, T.C., Kübler, T., Kasneci, E.: ElSe: ellipse selection for robust pupil detection in real-world environments. In: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (ETRA '16), pp. 123–130. ACM, New York, NY, USA (2016)
  7. Goni, S., Echeto, J., Villanueva, A., Cabeza, R.: Robust algorithm for pupil-glint vector detection in a video-oculography eyetracking system. In: Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004). IEEE (2004)
  8. Javadi, A.H., Hakimi, Z., Barati, M., Walsh, V., Tcheang, L.: SET: a pupil detection method using sinusoidal approximation. Front. Neuroeng. 8, 4 (2015)
  9. Jian, M., Lam, K.M.: Simultaneous hallucination and recognition of low-resolution faces based on singular value decomposition. IEEE Trans. Circuits Syst. Video Technol. 25(11), 1761–1772 (2015)
  10. Jian, M., Lam, K.M., Dong, J.: A novel face-hallucination scheme based on singular value decomposition. Pattern Recognit. 46(11), 3091–3102 (2013)
  11. Jian, M., Lam, K.M., Dong, J.: Facial-feature detection and localization based on a hierarchical scheme. Inf. Sci. 262, 1–14 (2014)
  12. Kasneci, E.: Towards the automated recognition of assistance need for drivers with impaired visual field. PhD thesis, University of Tübingen, Tübingen (2013). http://tobias-lib.uni-tuebingen.de/volltexte/2013/7033
  13. Kasneci, E., Sippel, K., Aehling, K., Heister, M., Rosenstiel, W., Schiefer, U., Papageorgiou, E.: Driving with binocular visual field loss? A study on a supervised on-road parcours with simultaneous eye and head tracking. PLoS One 9(2), e87470 (2014)
  14. Kasneci, E., Sippel, K., Heister, M., Aehling, K., Rosenstiel, W., Schiefer, U., Papageorgiou, E.: Homonymous visual field loss and its impact on visual exploration: a supermarket study. TVST 3(6), 2 (2014)
  15. Kassner, M., Patera, W., Bulling, A.: Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction. In: Adjunct Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 1151–1160 (2014). doi:10.1145/2638728.2641695
  16. Kasneci, E., Kasneci, G., Kübler, T.C., Rosenstiel, W.: Online recognition of fixations, saccades, and smooth pursuits for automated analysis of traffic hazard perception. In: Artificial Neural Networks: Methods and Applications in Bio-/Neuroinformatics, pp. 411–434. Springer International Publishing (2015). doi:10.1007/978-3-319-09903-3_20
  17. Keil, A., Albuquerque, G., Berger, K., Magnor, M.A.: Real-time gaze tracking with a consumer-grade video camera. In: Vaclav, S. (ed.) WSCG'2010, February 1–4, 2010. UNION Agency–Science Press, Plzen (2010)
  18. Li, D., Winfield, D., Parkhurst, D.J.: Starburst: a hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches. In: Computer Vision and Pattern Recognition-Workshops, 2005. IEEE Computer Society Conference on CVPR Workshops, pp. 79–79. IEEE (2005)
  19. Lin, L., Pan, L., Wei, L., Yu, L.: A robust and accurate detection of pupil images. In: 3rd International Conference on Biomedical Engineering and Informatics (BMEI), 2010, vol. 1, pp. 70–74. IEEE (2010)
  20. Liu, X., Xu, F., Fujimura, K.: Real-time eye detection and tracking for driver observation under various light conditions. In: Intelligent Vehicle Symposium, 2002, vol. 2, pp. 344–351. IEEE (2002)
  21. Long, X., Tonguz, O.K., Kiderman, A.: A high speed eye tracking system with robust pupil center estimation algorithm. In: Engineering in Medicine and Biology Society, 2007. EMBS 2007. 29th Annual International Conference of the IEEE. IEEE (2007)
  22. Majaranta, P., Bulling, A.: Eye tracking and eye-based human–computer interaction. In: Advances in Physiological Computing. Springer, London (2014). doi:10.1007/978-1-4471-6392-3_3
  23. Mohammed, G.J., Hong, B.R., Jarjes, A.A.: Accurate pupil features extraction based on new projection function. Comput. Inf. 29(4), 663–680 (2012)
  24. Pérez, A., Cordoba, M.L., Garcia, A., Méndez, R., Munoz, M.L., Pedraza, J.L., Sanchez, F.: A precise eye-gaze detection and tracking system. In: Václav, S. (ed.) WSCG '2003: Posters: The 11th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision 2003, pp. 105–108. UNION Agency, Plzen (2003)
  25. Schnipke, S.K., Todd, M.W.: Trials and tribulations of using an eye-tracking system. In: CHI '00 Extended Abstracts on Human Factors in Computing Systems. ACM (2000)
  26. Sippel, K., Kasneci, E., Aehling, K., Heister, M., Rosenstiel, W., Schiefer, U., Papageorgiou, E.: Binocular glaucomatous visual field loss and its impact on visual exploration—a supermarket study. PLoS One 9(8), e106089 (2014). doi:10.1371/journal.pone.0106089
  27. Stellmach, S., Dachselt, R.: Look & touch: gaze-supported target acquisition. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2981–2990. ACM (2012)
  28. Sugano, Y., Bulling, A.: Self-calibrating head-mounted eye trackers using egocentric visual saliency. In: Proceedings of the 28th ACM Symposium on User Interface Software and Technology (UIST), pp. 363–372 (2015). doi:10.1145/2807442.2807445
  29. Suzuki, S., et al.: Topological structural analysis of digitized binary images by border following. Comput. Vis. Graphics Image Process. 30(1), 32–46 (1985)
  30. Świrski, L., Bulling, A., Dodgson, N.: Robust real-time pupil tracking in highly off-axis images. In: Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), pp. 173–176. ACM (2012). doi:10.1145/2168556.2168585
  31. Tafaj, E., Kübler, T., Kasneci, G., Rosenstiel, W., Bogdan, M.: Online classification of eye tracking data for automated analysis of traffic hazard perception. In: Artificial Neural Networks and Machine Learning, ICANN 2013, vol. 8131, pp. 442–450. Springer, Berlin, Heidelberg (2013)
  32. Tonsen, M., Zhang, X., Sugano, Y., Bulling, A.: Labelled pupils in the wild: a dataset for studying pupil detection in unconstrained environments. In: Proceedings of the ACM International Symposium on Eye Tracking Research & Applications (ETRA), pp. 139–142 (2016). doi:10.1145/2857491.2857520
  33. Trösterer, S., Meschtscherjakov, A., Wilfinger, D., Tscheligi, M.: Eye tracking in the car: challenges in a dual-task scenario on a test track. In: Proceedings of the 6th AutomotiveUI. ACM (2014)
  34. Turner, J., Bulling, A., Alexander, J., Gellersen, H.: Cross-device gaze-supported point-to-point content transfer. In: Proceedings of the ACM International Symposium on Eye Tracking Research & Applications (ETRA), pp. 19–26 (2014). doi:10.1145/2578153.2578155
  35. Valenti, R., Gevers, T.: Accurate eye center location through invariant isocentric patterns. IEEE Trans. Pattern Anal. Mach. Intell. 34(9), 1785–1798 (2012)
  36. Viola, P., Jones, M.: Rapid object detection using a boosted cascade of simple features. In: Computer Vision and Pattern Recognition, 2001. Proceedings of the 2001 IEEE Computer Society Conference on CVPR 2001, vol. 1, pp. I-511. IEEE (2001)
  37. Wood, E., Bulling, A.: EyeTab: model-based gaze estimation on unmodified tablet computers. In: Proceedings of the 8th Symposium on Eye Tracking Research & Applications (ETRA), pp. 207–210 (2014). doi:10.1145/2578153.2578185
  38. Zhu, D., Moore, S.T., Raphan, T.: Robust pupil center detection using a curvature algorithm. Comput. Methods Progr. Biomed. 59(3), 145–157 (1999)

Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  • Wolfgang Fuhl (1)
  • Marc Tonsen (2)
  • Andreas Bulling (2)
  • Enkelejda Kasneci (1)

  1. Perception Engineering Group, University of Tübingen, Tübingen, Germany
  2. Perceptual User Interfaces Group, Max Planck Institute for Informatics, Saarbrücken, Germany
