
An adaptive algorithm for fast and reliable online saccade detection

  • Richard Schweitzer
  • Martin Rolfs

Abstract

To investigate visual perception around the time of eye movements, vision scientists manipulate stimuli contingent upon the onset of a saccade. For these experimental paradigms, timing is especially crucial, because saccade offset imposes a deadline on the display change. Although efficient online saccade detection can greatly improve timing, most algorithms rely on spatial-boundary techniques or absolute-velocity thresholds, which both suffer from weaknesses: late detections and false alarms, respectively. We propose an adaptive, velocity-based algorithm for online saccade detection that surpasses both standard techniques in speed and accuracy and allows the user to freely define the detection criteria. Inspired by the Engbert–Kliegl algorithm for microsaccade detection, our algorithm computes two-dimensional velocity thresholds from variance in the preceding fixation samples, while compensating for noisy or missing data samples. An optional direction criterion limits detection to the instructed saccade direction, further increasing robustness. We validated the algorithm by simulating its performance on a large saccade dataset and found that high detection accuracy (false-alarm rates of < 1%) could be achieved with detection latencies of only 3 ms. High accuracy was maintained even under simulated high-noise conditions. To demonstrate that purely intrasaccadic presentations are technically feasible, we devised an experimental test in which a Gabor patch drifted at saccadic peak velocities. Whereas this stimulus was invisible when presented during fixation, observers reliably detected it during saccades. Photodiode measurements verified that—including all system delays—the stimuli were physically displayed on average 20 ms after saccade onset. Thus, the proposed algorithm provides a valuable tool for gaze-contingent paradigms.
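
To make the abstract's description concrete, the following is a minimal Python sketch of the Engbert–Kliegl-style threshold idea it refers to: two-dimensional velocity thresholds are derived from a robust, median-based variance estimate of the preceding fixation samples, and a new velocity sample counts as saccadic once it falls outside the resulting elliptic boundary. This is an illustration only, not the reference implementation linked under Availability; the function names, the multiplier lam, and the simulated sample values are assumptions, and the noise compensation and direction criterion described in the article are omitted here.

    import numpy as np

    def adaptive_thresholds(vx_fix, vy_fix, lam=10.0):
        # Median-based estimate of the velocity variance during fixation,
        # computed separately for the horizontal and vertical components.
        sigma_x = np.sqrt(np.median(vx_fix ** 2) - np.median(vx_fix) ** 2)
        sigma_y = np.sqrt(np.median(vy_fix ** 2) - np.median(vy_fix) ** 2)
        # The detection thresholds scale this estimate by a multiplier lam.
        return lam * sigma_x, lam * sigma_y

    def exceeds_threshold(vx, vy, eta_x, eta_y):
        # A sample is flagged as saccadic if its velocity lies outside the
        # ellipse spanned by the two thresholds.
        return (vx / eta_x) ** 2 + (vy / eta_y) ** 2 > 1.0

    # Illustrative use: thresholds from 200 fixation samples (e.g., 200 ms at 1 kHz),
    # followed by a test of the most recent online velocity sample.
    rng = np.random.default_rng(1)
    vx_fix = rng.normal(0.0, 5.0, 200)  # simulated fixation noise, deg/s
    vy_fix = rng.normal(0.0, 5.0, 200)
    eta_x, eta_y = adaptive_thresholds(vx_fix, vy_fix)
    print(exceeds_threshold(300.0, 40.0, eta_x, eta_y))  # fast horizontal sample -> True

Using medians rather than means keeps the thresholds robust against occasional noisy samples, in the spirit of the compensation for noisy or missing data that the abstract describes.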

Keywords

Saccade detection · Eye movements · Intrasaccadic perception · Gaze-contingent presentation

Notes

Acknowledgements

We thank Ralf Engbert, Konstantin Mergenthaler, Petra Sinn, and Hans Trukenbrod for making the code of their microsaccade detection toolbox publicly available, and Nicolas Devillard for his excellent ANSI C implementations and comparisons of different median-search algorithms (http://ndevilla.free.fr/median/median/index.html). R.S. was supported by the Studienstiftung des deutschen Volkes and the Berlin School of Mind and Brain. M.R. was supported by the Deutsche Forschungsgemeinschaft (DFG, grants RO3579/2-1, RO3579/8-1, and RO3579/10-1).

Availability

Implementations of the proposed algorithm in C, Python, and Matlab are available on GitHub: https://github.com/richardschweitzer/OnlineSaccadeDetection.

Open practices statement

The saccade data and code used for simulations, data collected throughout the experimental test, experimental code, and data analysis scripts are available on the Open Science Framework: https://osf.io/3pck5/. The experimental test was not preregistered.

Author contributions

R.S. implemented the algorithm and ran the simulations. The validation procedure was conceptualized by R.S. and M.R. The experimental test was designed, run, and analyzed by R.S. under M.R.’s supervision. R.S. drafted the manuscript, and M.R. provided critical revisions.

References

  1. Arabadzhiyska, E., Tursun, O. T., Myszkowski, K., Seidel, H.-P., & Didyk, P. (2017). Saccade landing position prediction for gaze-contingent rendering. ACM Transactions on Graphics, 36, 50.
  2. Balsdon, T., Schweitzer, R., Watson, T. L., & Rolfs, M. (2018). All is not lost: Post-saccadic contributions to the perceptual omission of intra-saccadic streaks. Consciousness and Cognition, 64, 19–31.
  3. Bates, D., Mächler, M., Bolker, B., & Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67, 1–48. https://doi.org/10.18637/jss.v067.i01
  4. Bedell, H. E., & Yang, J. (2001). The attenuation of perceived image smear during saccades. Vision Research, 41, 521–528.
  5. Burr, D. C., Holt, J., Johnstone, J. R., & Ross, J. (1982). Selective depression of motion sensitivity during saccades. Journal of Physiology, 333, 1–15.
  6. Burr, D. C., Morrone, M. C., & Ross, J. (1994). Selective suppression of the magnocellular visual pathway during saccadic eye movements. Nature, 371, 511–513. https://doi.org/10.1038/371511a0
  7. Campbell, F. W., & Wurtz, R. H. (1978). Saccadic omission: Why we do not see a grey-out during a saccadic eye movement. Vision Research, 18, 1297–1303.
  8. Castet, E. (2010). Perception of intra-saccadic motion. In U. J. Ilg & G. S. Masson (Eds.), Dynamics of visual motion processing (pp. 213–238). Berlin, Germany: Springer.
  9. Castet, E., Jeanjean, S., & Masson, G. S. (2002). Motion perception of saccade-induced retinal translation. Proceedings of the National Academy of Sciences, 99, 15159–15163.
  10. Castet, E., & Masson, G. S. (2000). Motion perception during saccadic eye movements. Nature Neuroscience, 2, 177–183.
  11. Collewijn, H., Erkelens, C. J., & Steinman, R. M. (1988). Binocular co-ordination of human horizontal saccadic eye movements. Journal of Physiology, 404, 157–182.
  12. Collins, T., Rolfs, M., Deubel, H., & Cavanagh, P. (2009). Post-saccadic location judgments reveal remapping of saccade targets to non-foveal locations. Journal of Vision, 9(5), 29. https://doi.org/10.1167/9.5.29
  13. Cornelissen, F. W., Peters, E. M., & Palmer, J. (2002). The Eyelink Toolbox: Eye tracking with MATLAB and the Psychophysics Toolbox. Behavior Research Methods, Instruments, & Computers, 34, 613–617. https://doi.org/10.3758/BF03195489
  14. Dalmaijer, E. S., Mathôt, S., & Van der Stigchel, S. (2014). PyGaze: An open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments. Behavior Research Methods, 46, 913–921. https://doi.org/10.3758/s13428-013-0422-2
  15. Delorme, A., & Makeig, S. (2004). EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. Journal of Neuroscience Methods, 134, 9–21. https://doi.org/10.1016/j.jneumeth.2003.10.009
  16. Deubel, H., Elsner, T., & Hauske, G. (1987). Saccadic eye movements and the detection of fast-moving gratings. Biological Cybernetics, 57, 37–45.
  17. Deubel, H., Schneider, W. X., & Bridgeman, B. (1996). Postsaccadic target blanking prevents saccadic suppression of image displacement. Vision Research, 36, 985–996. https://doi.org/10.1016/0042-6989(95)00203-0
  18. Diamond, M. R., Ross, J., & Morrone, M. C. (2000). Extraretinal control of saccadic suppression. Journal of Neuroscience, 20, 3449–3455.
  19. Dimigen, O., Sommer, W., Hohlfeld, A., Jacobs, A. M., & Kliegl, R. (2011). Coregistration of eye movements and EEG in natural reading: Analyses and review. Journal of Experimental Psychology: General, 140, 552–572. https://doi.org/10.1037/a0023885
  20. Dorr, M., Martinetz, T., Gegenfurtner, K. R., & Barth, E. (2010). Variability of eye movements when viewing dynamic natural scenes. Journal of Vision, 10(10), 28. https://doi.org/10.1167/10.10.28
  21. Duyck, M., Collins, T., & Wexler, M. (2016). Masking the saccadic smear. Journal of Vision, 16(10), 1. https://doi.org/10.1167/16.10.1
  22. Engbert, R., & Kliegl, R. (2003). Microsaccades uncover the orientation of covert attention. Vision Research, 43, 1035–1045. https://doi.org/10.1016/S0042-6989(03)00084-1
  23. Engbert, R., & Mergenthaler, K. (2006). Microsaccades are triggered by low retinal image slip. Proceedings of the National Academy of Sciences, 103, 7192–7197.
  24. Engbert, R., Rothkegel, L., Backhaus, D., & Trukenbrod, H. A. (2016). Evaluation of velocity-based saccade detection in the SMI-ETG 2W system [Technical Report]. Retrieved from http://read.psych.uni-potsdam.de/attachments/article/156/TechRep-16-1-Engbert.pdf
  25. García-Pérez, M. A., & Peli, E. (2011). Visual contrast processing is largely unaltered during saccades. Frontiers in Psychology, 2, 247. https://doi.org/10.3389/fpsyg.2011.00247
  26. Han, P., Saunders, D. R., Woods, R. L., & Luo, G. (2013). Trajectory prediction of saccadic eye movements using a compressed exponential model. Journal of Vision, 13(8), 27. https://doi.org/10.1167/13.8.27
  27. Higgins, E., & Rayner, K. (2015). Transsaccadic processing: Stability, integration, and the potential role of remapping. Attention, Perception, & Psychophysics, 77, 3–27. https://doi.org/10.3758/s13414-014-0751-y
  28. Hollingworth, A., Richard, A. M., & Luck, S. J. (2008). Understanding the function of visual short-term memory: Transsaccadic memory, object correspondence, and gaze correction. Journal of Experimental Psychology: General, 137, 163–181. https://doi.org/10.1037/0096-3445.137.1.163
  29. Kalogeropoulou, Z., & Rolfs, M. (2017). Saccadic eye movements do not disrupt the deployment of feature-based attention. Journal of Vision, 17(8), 4. https://doi.org/10.1167/17.8.4
  30. Kleiner, M., Brainard, D., & Pelli, D. (2007). What’s new in Psychtoolbox-3? Perception, 36(ECVP Abstract Suppl.), 14.
  31. Mathôt, S., Melmi, J., & Castet, E. (2015). Intrasaccadic perception triggers pupillary constriction. PeerJ, 3, e1150.
  32. Matin, E., Clymer, A. B., & Matin, L. (1972). Metacontrast and saccadic suppression. Science, 178, 179–182.
  33. Melcher, D., & Colby, C. L. (2008). Trans-saccadic perception. Trends in Cognitive Sciences, 12, 466–473. https://doi.org/10.1016/j.tics.2008.09.003
  34. Nyström, M., Andersson, R., Holmqvist, K., & Van De Weijer, J. (2013). The influence of calibration method and eye physiology on eyetracking data quality. Behavior Research Methods, 45, 272–288. https://doi.org/10.3758/s13428-012-0247-4
  35. Panouillères, M. T., Gaveau, V., Debatisse, J., Jacquin, P., LeBlond, M., & Pélisson, D. (2016). Oculomotor adaptation elicited by intra-saccadic visual stimulation: Time-course of efficient visual target perturbation. Frontiers in Human Neuroscience, 10, 91. https://doi.org/10.3389/fnhum.2016.00091
  36. Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442. https://doi.org/10.1163/156856897X00366
  37. Poth, C. H., Foerster, R. M., Behler, C., Schwanecke, U., Schneider, W. X., & Botsch, M. (2018). Ultrahigh temporal resolution of visual presentation using gaming monitors and G-Sync. Behavior Research Methods, 50, 26–38.
  38. Press, W. H., Teukolsky, S. A., Vetterling, W. T., & Flannery, B. P. (2007). Numerical recipes: The art of scientific computing (3rd ed.). Cambridge, UK: Cambridge University Press.
  39. Prime, S. L., Vesia, M., & Crawford, J. D. (2011). Cortical mechanisms for trans-saccadic memory and integration of multiple object features. Philosophical Transactions of the Royal Society B, 366, 540–553.
  40. Rayner, K. (1975). The perceptual span and peripheral cues in reading. Cognitive Psychology, 7, 65–81.
  41. Ross, J., Morrone, M. C., Goldberg, M. E., & Burr, D. C. (2001). Changes in visual perception at the time of saccades. Trends in Neurosciences, 24, 113–121.
  42. Schweitzer, R., & Rolfs, M. (2017). Intra-saccadic motion streaks as a cue to the localization of objects across eye movements [Abstract]. Journal of Vision, 17(10), 918. https://doi.org/10.1167/17.10.918
  43. Schweitzer, R., Watson, T., Watson, J., & Rolfs, M. (2019). The joy of retinal painting: A build-it-yourself device for intrasaccadic presentations. Perception, 48, 1020–1025. https://doi.org/10.1177/0301006619867868
  44. SR Research. (2005). EyeLink II user manual, version 2.14. Mississauga, ON: SR Research Ltd.
  45. SR Research. (2010). EyeLink 1000 user manual, version 1.5.2. Mississauga, ON: SR Research Ltd.
  46. SR Research. (2013). EyeLink 1000 Plus user manual, version 1.0.12. Mississauga, ON: SR Research Ltd.
  47. Szinte, M., & Cavanagh, P. (2011). Spatiotopic apparent motion reveals local variations in space constancy. Journal of Vision, 11(2), 4. https://doi.org/10.1167/11.2.4
  48. Tobii Technology AB. (2010). Timing guide for Tobii eye trackers and eye tracking software [Technical Report]. Retrieved from https://www.tobiipro.com/siteassets/tobii-pro/learn-and-support/design/eye-tracker-timing-performance/tobii-eye-tracking-timing.pdf
  49. Townsend, J. T., & Ashby, F. G. (1978). Methods of modeling capacity in simple processing systems. In J. N. J. Castellan & F. Restle (Eds.), Cognitive theory (Vol. 3, pp. 199–239). New York, NY: Erlbaum.
  50. Townsend, J. T., & Ashby, F. G. (1983). Stochastic modeling of elementary psychological processes. Cambridge, UK: Cambridge University Press.
  51. Volkmann, F. C. (1986). Human visual suppression. Vision Research, 26, 1401–1416.
  52. Volkmann, F. C., Riggs, L. A., White, K. D., & Moore, R. K. (1978). Contrast sensitivity during saccadic eye movements. Vision Research, 18, 1193–1199.
  53. VPixx Technologies. (2017). TRACKPixx3 hardware manual, version 1.0. Saint-Bruno, QC: VPixx Technologies Inc.
  54. Watson, T. L., & Krekelberg, B. (2009). The relationship between saccadic suppression and perceptual stability. Current Biology, 19, 1040–1043.
  55. Watson, T., Schweitzer, R., Castet, E., Ohl, S., & Rolfs, M. (2017). Intra-saccadic localisation is consistently carried out in world-centered coordinates [Abstract]. Journal of Vision, 17(10), 1276. https://doi.org/10.1167/17.10.1276
  56. Wolf, C., & Schütz, A. C. (2015). Trans-saccadic integration of peripheral and foveal feature information is close to optimal. Journal of Vision, 15(16), 1. https://doi.org/10.1167/15.16.1
  57. Yuval-Greenberg, S., Merriam, E. P., & Heeger, D. J. (2014). Spontaneous microsaccades reflect shifts in covert attention. Journal of Neuroscience, 34, 13693–13700. https://doi.org/10.1523/JNEUROSCI.0582-14.2014

Copyright information

© The Psychonomic Society, Inc. 2019

Authors and Affiliations

  1. Department of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
  2. Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
  3. Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany