Online Classification of Eye Tracking Data for Automated Analysis of Traffic Hazard Perception

  • Enkelejda Tafaj
  • Thomas C. Kübler
  • Gjergji Kasneci
  • Wolfgang Rosenstiel
  • Martin Bogdan
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8131)

Abstract

Complex and hazardous driving situations often arise from the delayed perception of traffic objects. To detect automatically whether such objects have been perceived by the driver, techniques are needed that can reliably recognize whether the driver’s eyes have fixated on, or are pursuing, the hazardous object, i.e., that detect fixations, saccades, and smooth pursuits in raw eye-tracking data. This paper presents a system for analyzing the driver’s visual behavior based on an adaptive online algorithm that detects and distinguishes between fixation clusters, saccades, and smooth pursuits.
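The paper's method is an adaptive online algorithm; as a simpler illustration of the underlying classification task, the sketch below labels each incoming gaze sample by its point-to-point velocity using two fixed thresholds (in the spirit of velocity-based schemes such as I-VVT). The threshold values and sampling rate are illustrative assumptions, not the paper's adaptive parameters.

```python
# Minimal online eye-movement classifier sketch (assumed two-threshold
# velocity scheme, NOT the paper's adaptive Bayesian algorithm).
# Gaze coordinates are assumed to be in degrees of visual angle.

SACCADE_VEL = 100.0   # deg/s: above this, label the sample a saccade (assumed)
PURSUIT_VEL = 10.0    # deg/s: between the thresholds, a smooth pursuit (assumed)

def classify_online(samples, rate_hz=60.0):
    """Yield ((x, y), label) pairs, labeling each incoming gaze point as
    'fixation', 'pursuit', or 'saccade' from its instantaneous
    point-to-point velocity, one sample at a time (online)."""
    prev = None
    for x, y in samples:
        if prev is None:
            label = 'fixation'  # no velocity estimate for the first sample
        else:
            vx = (x - prev[0]) * rate_hz
            vy = (y - prev[1]) * rate_hz
            v = (vx * vx + vy * vy) ** 0.5
            if v > SACCADE_VEL:
                label = 'saccade'
            elif v > PURSUIT_VEL:
                label = 'pursuit'
            else:
                label = 'fixation'
        prev = (x, y)
        yield (x, y), label
```

A fixed-threshold scheme like this is known to confuse slow saccades with pursuits and pursuits with drifting fixations, which is precisely the limitation that motivates the adaptive, probabilistic approach the paper proposes.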

Keywords

classification, eye data, traffic hazard perception


Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Enkelejda Tafaj (1)
  • Thomas C. Kübler (1)
  • Gjergji Kasneci (2)
  • Wolfgang Rosenstiel (1)
  • Martin Bogdan (3)
  1. Department of Computer Engineering, University of Tübingen, Germany
  2. Hasso-Plattner-Institute, Germany
  3. Department of Computer Engineering, University of Leipzig, Germany
