Behavior Research Methods, Volume 50, Issue 2, pp 451–465

Online webcam-based eye tracking in cognitive science: A first look



Online experimentation is emerging in many areas of cognitive psychology as a viable alternative or supplement to classical in-lab experimentation. While performance- and reaction-time-based paradigms have been covered in recent studies, one instrument of cognitive psychology has so far received little attention: eye tracking. In this study, we used JavaScript-based eye-tracking algorithms recently made available by Papoutsaki et al. (International Joint Conference on Artificial Intelligence, 2016) together with consumer-grade webcams to investigate whether online eye tracking can benefit from the common advantages of online data collection. We compared three tasks conducted in the lab (fixation, pursuit, and free viewing) with online-acquired data, analyzing spatial precision in the first two tasks and the replicability of well-known gaze patterns in the third. Our results indicate that in-lab data exhibit an offset of about 172 px (15% of screen size, 3.94° of visual angle) in the fixation task, while online data are slightly less accurate (18% of screen size, 207 px) and show higher variance. The same pattern was found for the pursuit task, with a constant offset during stimulus movement (211 px in-lab, 216 px online). In the free-viewing task, we were able to replicate the high attention attribution to the eyes (28.25%) compared with other key regions such as the nose (9.71%) and mouth (4.00%). Overall, we found web-technology-based eye tracking to be suitable for all three tasks and are confident that the required hardware and software will continue to improve, enabling even more sophisticated experimental paradigms across cognitive psychology.
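The pixel offsets reported above can be related to degrees of visual angle with simple trigonometry. The sketch below illustrates one way to do this conversion; the screen width (in pixels and centimeters) and the viewing distance used in the example are illustrative assumptions, not values taken from the study.

```javascript
// Convert a gaze-estimation offset in pixels to degrees of visual angle.
// NOTE: screenWidthPx, screenWidthCm, and viewingDistanceCm below are
// assumed example values -- the actual geometry varies per participant
// and monitor, and is not specified in this abstract.
function pxToVisualAngle(offsetPx, screenWidthPx, screenWidthCm, viewingDistanceCm) {
  // Physical size of the offset on the screen, in centimeters
  const offsetCm = offsetPx * (screenWidthCm / screenWidthPx);
  // Visual angle subtended by that size at the given viewing distance
  const angleRad = 2 * Math.atan(offsetCm / (2 * viewingDistanceCm));
  return angleRad * (180 / Math.PI);
}

// Example: a 172 px offset on a hypothetical 1280 px / 33 cm wide screen
// viewed from 60 cm corresponds to roughly 4 degrees of visual angle.
console.log(pxToVisualAngle(172, 1280, 33, 60).toFixed(2));
```

Under these assumed parameters the result is close to the ~4° figures reported in the abstract, which is why per-participant screen geometry matters when comparing in-lab and online accuracy.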


Keywords: Online experiment · Web technology · Eye tracking · Online study · Cognitive psychology



We would like to thank Astrid Hönekopp and Alexander Diel for help in collecting the data and Katharina Sommer for providing the instructional images. All code, raw data, and analysis files can be found at the Open Science Framework (


  1. Allopenna, P. D., Magnuson, J. S., & Tanenhaus, M. K. (1998). Tracking the time course of spoken word recognition using eye movements: Evidence for continuous mapping models. Journal of Memory and Language, 38(4), 419–439. doi: 10.1006/jmla.1997.2558
  2. Birnbaum, M. H. (2000). Introduction to psychological experiments on the internet. In M. H. Birnbaum (Ed.), Psychological experiments on the internet (pp. xv–xx). Academic Press. doi: 10.1016/B978-012099980-4/50001-0
  3. Blais, C., Jack, R. E., Scheepers, C., Fiset, D., & Caldara, R. (2008). Culture shapes how we look at faces. PLoS ONE, 3(8). doi: 10.1371/journal.pone.0003022
  4. Boraston, Z., & Blakemore, S.-J. (2007). The application of eye-tracking technology in the study of autism. The Journal of Physiology, 581, 893–898. doi: 10.1113/jphysiol.2007.133587
  5. Bulling, A., & Gellersen, H. (2010). Toward mobile eye-based human-computer interaction. IEEE Pervasive Computing, 9(4), 8–12. doi: 10.1109/MPRV.2010.86
  6. Burton, A. M., White, D., & McNeill, A. (2010). The Glasgow face matching test. Behavior Research Methods, 42(1), 286–291. doi: 10.3758/BRM.42.1.286
  7. Caldara, R., Zhou, X., & Miellet, S. (2010). Putting culture under the “Spotlight” reveals universal information use for face recognition. PLoS ONE, 5(3), 1–12. doi: 10.1371/journal.pone.0009708
  8. Chapman, P., Underwood, G., & Roberts, K. (2002). Visual search patterns in trained and untrained novice drivers. Transportation Research Part F: Traffic Psychology and Behaviour, 5(2), 157–167. doi: 10.1016/S1369-8478(02)00014-1
  9. Chen, M. (2001). What can a mouse cursor tell us more? Correlation of eye/mouse movements on web browsing. Proceedings of the ACM Conference on Human Factors in Computing Systems, 281–282. doi: 10.1145/634067.634234
  10. Chua, H. F., Boland, J. E., & Nisbett, R. E. (2005). Cultural variation in eye movements during scene perception. Proceedings of the National Academy of Sciences of the United States of America, 102(35), 12629–12633. doi: 10.1073/pnas.0506162102
  11. Dalmaijer, E. S. (2014). Is the low-cost EyeTribe eye tracker any good for research? PeerJ PrePrints, 4(606901), 1–35. doi: 10.7287/peerj.preprints.141v2
  12. Dalmaijer, E. S., Mathôt, S., & Van der Stigchel, S. (2013). PyGaze: An open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments. Behavior Research Methods. doi: 10.3758/s13428-013-0422-2
  13. Duchowski, A. T. (2002). A breadth-first survey of eye-tracking applications. Behavior Research Methods, Instruments, & Computers, 34(4), 455–470. doi: 10.3758/BF03195475
  14. Fukushima, K., Fukushima, J., Warabi, T., & Barnes, G. R. (2013). Cognitive processes involved in smooth pursuit eye movements: Behavioral evidence, neural substrate and clinical correlation. Frontiers in Systems Neuroscience, 7, 4. doi: 10.3389/fnsys.2013.00004
  15. Germine, L., Nakayama, K., Duchaine, B. C., Chabris, C. F., Chatterjee, G., & Wilmer, J. B. (2012). Is the Web as good as the lab? Comparable performance from Web and lab in cognitive/perceptual experiments. Psychonomic Bulletin & Review, 19(5), 847–857. doi: 10.3758/s13423-012-0296-9
  16. Gosling, S. D., Vazire, S., Srivastava, S., & John, O. P. (2004). Should we trust web-based studies? A comparative analysis of six preconceptions about internet questionnaires. The American Psychologist, 59(2), 93–104. doi: 10.1037/0003-066X.59.2.93
  17. Holzman, P. S., Proctor, L. R., & Hughes, D. W. (1973). Eye-tracking patterns in schizophrenia. Science, 181(4095), 179–181.
  18. Janik, S. W., Wellens, A. R., Goldberg, M. L., & Dell’Osso, L. F. (1978). Eyes as the center of focus in the visual examination of human faces. Perceptual and Motor Skills, 47, 857–858. doi: 10.2466/pms.1978.47.3.857
  19. Lisberger, S. G., Morris, E. J., & Tychsen, L. (1987). Visual motion processing and sensory-motor integration for smooth pursuit eye movements. Annual Review of Neuroscience, 10(1), 97–129. doi: 10.1146/
  20. Mathôt, S., Schreij, D., & Theeuwes, J. (2012). OpenSesame: An open-source, graphical experiment builder for the social sciences. Behavior Research Methods, 44(2), 314–324. doi: 10.3758/s13428-011-0168-7
  21. Møller, F., Laursen, M. L., Tygesen, J., & Sjølie, A. K. (2002). Binocular quantification and characterization of microsaccades. Graefe's Archive for Clinical and Experimental Ophthalmology, 240(9), 765–770. doi: 10.1007/s00417-002-0519-2
  22. Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., & Hays, J. (2016). WebGazer: Scalable webcam eye tracking using user interactions. International Joint Conference on Artificial Intelligence.
  23. Pelphrey, K. A., Sasson, N. J., Reznick, J. S., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 249–261. doi: 10.1023/A:1016374617369
  24. Semmelmann, K., & Weigelt, S. (2016). Online psychophysics: Reaction time effects in cognitive experiments. Behavior Research Methods. doi: 10.3758/s13428-016-0783-4
  25. Stewart, N., Ungemach, C., Harris, A. J. L., Bartels, D. M., Newell, B. R., Paolacci, G., & Chandler, J. (2015). The average laboratory samples a population of 7,300 Amazon Mechanical Turk workers. Judgment and Decision Making, 10(5), 479–491.
  26. Valenti, R., Staiano, J., Sebe, N., & Gevers, T. (2009). Webcam-based visual gaze estimation (pp. 662–671).
  27. van Gog, T., & Scheiter, K. (2010). Eye tracking as a tool to study and enhance multimedia learning. Learning and Instruction, 20(2), 95–99. doi: 10.1016/j.learninstruc.2009.02.009
  28. Walker, R., Walker, D. G., Husain, M., & Kennard, C. (2000). Control of voluntary and reflexive saccades. Experimental Brain Research, 130(4), 540–544. doi: 10.1007/s002219900285
  29. Wedel, M., & Pieters, R. (2000). Eye fixations on advertisements and memory for brands: A model and findings. Marketing Science, 19(4), 297–312. doi: 10.1287/mksc.
  30. Xu, P., Ehinger, K. A., Zhang, Y., Finkelstein, A., Kulkarni, S. R., & Xiao, J. (2015). TurkerGaze: Crowdsourcing saliency with webcam based eye tracking. arXiv preprint arXiv: …
  31. Yarbus, A. L. (1967). Eye movements and vision. Neuropsychologia, 6(4), 222. doi: 10.1016/0028-3932(68)90012-2

Copyright information

© Psychonomic Society, Inc. 2017

Authors and Affiliations

  1. Developmental Neuropsychology, Department of Psychology, Ruhr-University Bochum, Bochum, Germany
