Behavior Research Methods, Volume 41, Issue 1, pp 1–12

Timing accuracy of Web experiments: A case study using the WebExp software package

  • Frank Keller
  • Subahshini Gunasekharan
  • Neil Mayo
  • Martin Corley

Abstract

Although Internet-based experiments are gaining in popularity, most studies rely on directly evaluating participants’ responses rather than response times. In the present article, we present two experiments that demonstrate the feasibility of collecting response latency data over the World-Wide Web using WebExp—a software package designed to run psychological experiments over the Internet. Experiment 1 uses WebExp to collect measurements for known time intervals (generated using keyboard repetition). The resulting measurements are found to be accurate across platforms and load conditions. In Experiment 2, we use WebExp to replicate a lab-based self-paced reading study from the psycholinguistic literature. The data of the Web-based replication correlate significantly with those of the original study and show the same main effects and interactions. We conclude that WebExp can be used to obtain reliable response time data, at least for the self-paced reading paradigm.
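The validation logic described for Experiment 1 can be illustrated with a short sketch. The snippet below is browser TypeScript, not WebExp itself (WebExp is a Java-based package with its own timing routines): holding a key down makes the operating system's auto-repeat generate events at a roughly known interval, each event is timestamped, and the measured inter-event intervals are compared with the nominal one. The 30 ms repeat interval and the use of the space bar are illustrative assumptions only.

```typescript
// Illustrative sketch: validate timing measurement against a known interval
// produced by keyboard auto-repeat (cf. Experiment 1). Not WebExp code.

const NOMINAL_REPEAT_MS = 30;          // assumed OS auto-repeat interval (placeholder)
const timestamps: number[] = [];

window.addEventListener("keydown", (event: KeyboardEvent) => {
  if (event.key !== " ") return;       // measure only the held-down space bar
  timestamps.push(performance.now());  // high-resolution timestamp in ms
});

window.addEventListener("keyup", () => {
  // Once the key is released, compute successive inter-keystroke intervals.
  const intervals = timestamps.slice(1).map((t, i) => t - timestamps[i]);
  if (intervals.length === 0) return;

  const mean = intervals.reduce((a, b) => a + b, 0) / intervals.length;
  const deviation = mean - NOMINAL_REPEAT_MS;
  console.log(`mean interval: ${mean.toFixed(2)} ms, deviation: ${deviation.toFixed(2)} ms`);
  timestamps.length = 0;               // reset for the next trial
});
```

Under this kind of check, systematic deviation of the measured mean from the nominal repeat interval would indicate a timing bias in the measurement pathway, which is the property the article evaluates across platforms and load conditions.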



Copyright information

© Psychonomic Society, Inc. 2009

Authors and Affiliations

  • Frank Keller 1
  • Subahshini Gunasekharan 1
  • Neil Mayo 1
  • Martin Corley 1

  1. School of Informatics, University of Edinburgh, Edinburgh, Scotland
