Improving the performance of eye trackers with limited spatial accuracy and low sampling rates for reading analysis by heuristic fixation-to-word mapping

  • Oleg Špakov
  • Howell Istance
  • Aulikki Hyrskykari
  • Harri Siirtola
  • Kari-Jouko Räihä

Abstract

The recent growth in low-cost eye-tracking systems makes it feasible to incorporate real-time measurement and analysis of eye position data into activities such as learning to read. It also enables field studies of reading behavior in the classroom and other learning environments. We present a study of the data quality provided by two remote eye trackers, one of which is a low-cost, low-sampling-rate system. We then present two algorithms for mapping the fixations derived from the data to the words being read: one for immediate (real-time) mapping of fixations to words, the other for deferred (post hoc) mapping. Following this, an evaluation study is reported. Both studies were carried out with second-grade students in the classroom of a Finnish elementary school. The evaluation study shows very high success rates in automatically mapping fixations to the lines of text being read when the mapping is deferred. The success rates for immediate mapping are comparable with those obtained in earlier studies, although here the data were collected some 10 min after the initial calibration of low-sampling-rate (30 Hz) remote eye trackers, rather than in a laboratory setting using high-sampling-rate trackers. The results provide a solid basis for developing systems for use in classrooms and other learning environments that can provide immediate automatic support with reading and share data between a group of learners and their teacher, making possible new approaches to the learning of reading and comprehension skills.
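
The mapping algorithms themselves are specified in the body of the paper, not in this abstract. As a rough, non-authoritative illustration of the kind of heuristic involved, the Python sketch below (all names and the two-step heuristic are our own assumptions, not the authors' method) maps a fixation to a word by first snapping it to the nearest line of text, on the assumption that vertical offset dominates the error of a tracker with limited spatial accuracy, and then choosing the horizontally nearest word on that line.

    from dataclasses import dataclass

    @dataclass
    class Word:
        text: str
        x_left: float    # horizontal extent of the word on screen (pixels)
        x_right: float
        line_y: float    # vertical centre of the text line the word sits on

    def map_fixation_to_word(fx: float, fy: float, words: list[Word]) -> Word:
        # Step 1: snap the fixation to the nearest line of text, since
        # with limited spatial accuracy the vertical offset is typically
        # the main source of word-mapping errors in reading data.
        nearest_line_y = min({w.line_y for w in words},
                             key=lambda y: abs(fy - y))
        line_words = [w for w in words if w.line_y == nearest_line_y]

        # Step 2: pick the word on that line whose horizontal extent
        # contains fx (distance 0); otherwise the horizontally nearest word.
        def h_dist(w: Word) -> float:
            return max(w.x_left - fx, fx - w.x_right, 0.0)

        return min(line_words, key=h_dist)

    # Example: two words on the first line, one on the second.
    words = [Word("The", 100, 160, 200), Word("cat", 170, 230, 200),
             Word("sat", 100, 160, 240)]
    print(map_fixation_to_word(175, 210, words).text)  # -> cat

A deferred (post hoc) variant can in principle do better than this immediate mapping because it sees the whole scanpath before assigning lines; the specific correction strategies used in the paper are not reproduced here.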

Keywords

Low-cost eye tracker · Reading aid · Fixation mapping algorithm · Data quality · Elementary school

Acknowledgements

The work was supported by the Academy of Finland as part of the GaSP project (Grant number 2501287895). We wish to thank the students of class 2C at Lamminpää School in Tampere, Finland, who took part so enthusiastically in the work reported here. We are also grateful to Matti Taimi and Suvi Taipale, members of staff at the school, for their great support. Inka Hyrskykari worked as the research assistant and ran the data collection trials in the classroom. We also wish to thank the reviewers of the paper for their valuable comments and input.

Copyright information

© Psychonomic Society, Inc. 2018

Authors and Affiliations

  • Oleg Špakov¹
  • Howell Istance¹
  • Aulikki Hyrskykari¹
  • Harri Siirtola¹
  • Kari-Jouko Räihä¹

  1. Tampere Unit for Computer-Human Interaction, University of Tampere, Tampere, Finland