Best practices: Two Web-browser-based methods for stimulus presentation in behavioral experiments with high-resolution timing requirements

  • Pablo Garaizar
  • Ulf-Dietrich Reips


The Web is a prominent platform for behavioral experiments, for many reasons (relative simplicity, ubiquity, and accessibility, among others). Over the last few years, many behavioral and social scientists have conducted Internet-based experiments using standard web technologies, both in native JavaScript and using research-oriented frameworks. At the same time, vendors of widely used web browsers have been working hard to improve the performance of their software. However, the goals of browser vendors do not always coincide with behavioral researchers’ needs. Whereas vendors want high-performance browsers to respond almost instantly and to trade off accuracy for speed, researchers have the opposite trade-off goal, wanting their browser-based experiments to exactly match the experimental design and procedure. In this article, we review and test some of the best practices suggested by web-browser vendors, based on the features provided by new web standards, in order to optimize animations for browser-based behavioral experiments with high-resolution timing requirements. Using specialized hardware, we conducted four studies to determine the accuracy and precision of two different methods. The results using CSS animations in web browsers (Method 1) with GPU acceleration turned off showed biases that depend on the combination of browser and operating system. The results of tests on the latest versions of GPU-accelerated web browsers showed no frame loss in CSS animations. The same happened in many, but not all, of the tests conducted using requestAnimationFrame (Method 2) instead of CSS animations. Unbeknownst to many researchers, vendors of web browsers implement complex technologies that result in reduced quality of timing. 
Therefore, behavioral researchers interested in timing-dependent procedures should be cautious when developing browser-based experiments and should test the accuracy and precision of the whole experimental setup (web application, web browser, operating system, and hardware).
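The requestAnimationFrame approach (Method 2) described above can be sketched as follows. This is a minimal illustration, not the authors' actual test harness; the names `presentStimulus`, `frameTimes`, and `STIMULUS_FRAMES` are illustrative, and the setTimeout shim only stands in for the browser's repaint scheduler when run outside a browser. In a real browser, requestAnimationFrame fires once per display refresh (about 16.7 ms at 60 Hz), and gaps in the logged inter-frame intervals reveal dropped or delayed frames:

```javascript
// Method 2 sketch: drive a stimulus with requestAnimationFrame and log
// high-resolution frame timestamps to audit presentation timing.
// Assumed/illustrative names: presentStimulus, frameTimes, STIMULUS_FRAMES.
const raf = (typeof requestAnimationFrame === 'function')
  ? requestAnimationFrame
  : (cb) => setTimeout(() => cb(performance.now()), 16); // non-browser shim

const frameTimes = [];      // one high-resolution timestamp per frame
const STIMULUS_FRAMES = 6;  // e.g., ~100 ms of presentation at 60 Hz

function presentStimulus(timestamp) {
  frameTimes.push(timestamp);          // record when this frame was scheduled
  // ...update the stimulus (CSS transform, canvas draw, etc.) here...
  if (frameTimes.length < STIMULUS_FRAMES) {
    raf(presentStimulus);              // request the next repaint
  } else {
    // Inter-frame intervals: deviations from the refresh period
    // indicate dropped or delayed frames.
    const deltas = frameTimes.slice(1).map((t, i) => t - frameTimes[i]);
    console.log('frame intervals (ms):', deltas.map((d) => d.toFixed(1)));
  }
}

raf(presentStimulus);
```

As the article's results suggest, such a loop should itself be validated against external hardware measurement before timing-critical use, since the browser may coalesce, delay, or clamp callbacks.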


Keywords: Web animations · Experimental software · High-resolution timing · iScience · Browser



Copyright information

© Psychonomic Society, Inc. 2018

Authors and Affiliations

  1. University of Deusto, Bilbao, Spain
  2. University of Konstanz, Konstanz, Germany
