Behavior Research Methods, Volume 49, Issue 5, pp 1686–1695

Metronome LKM: An open source virtual keyboard driver to measure experiment software latencies

Abstract

Experiment software is often used to measure reaction times gathered with keyboards or other input devices. In previous studies, the accuracy and precision of time stamps have been assessed by several means: (a) generating accurate square-wave signals from an external device connected to the parallel port of the computer running the experiment software, (b) triggering the typematic repeat feature of some keyboards to obtain an evenly spaced series of keypress events, or (c) using a solenoid driven by a microcontroller to press the input device (keyboard, mouse button, touch screen) that will be used in the experimental setup. Despite the advantages of these approaches in some contexts, none of them can isolate the measurement error introduced by the experiment software itself. Metronome LKM provides a virtual keyboard for assessing experiment software. Using this open source driver, researchers can generate keypress events with high-resolution timers and compare the time stamps collected by the experiment software against those recorded by Metronome LKM (with nanosecond resolution). The software is highly configurable (keys pressed, intervals, SysRq activation) and runs on Linux kernels 2.6 through 4.8.
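The approach the abstract describes can be summarized in a short sketch: register a virtual input device with the Linux kernel's input subsystem and arm a high-resolution timer (hrtimer) whose callback injects key events at a fixed interval. The sketch below is a minimal illustration of that technique, not the actual Metronome LKM source; the module parameters period_ms and keycode are hypothetical names introduced here for clarity.

    /* Minimal sketch, assuming a standard Linux kernel build environment:
     * a loadable kernel module that registers a virtual keyboard and
     * emits evenly spaced keypress events from an hrtimer callback.
     * Illustrative only; not the actual Metronome LKM implementation. */
    #include <linux/module.h>
    #include <linux/input.h>
    #include <linux/hrtimer.h>
    #include <linux/ktime.h>

    static unsigned int period_ms = 1000;    /* interval between keypresses */
    module_param(period_ms, uint, 0444);
    static unsigned int keycode = KEY_SPACE; /* key to press */
    module_param(keycode, uint, 0444);

    static struct input_dev *vkbd;
    static struct hrtimer timer;

    static enum hrtimer_restart tick(struct hrtimer *t)
    {
        /* Emit a key press followed by a release. A real driver would
         * also record ktime_get_ns() here, to be compared later with
         * the time stamps logged by the experiment software. */
        input_report_key(vkbd, keycode, 1);
        input_sync(vkbd);
        input_report_key(vkbd, keycode, 0);
        input_sync(vkbd);

        /* Re-arm the timer relative to now and keep it running. */
        hrtimer_forward_now(t, ms_to_ktime(period_ms));
        return HRTIMER_RESTART;
    }

    static int __init vkbd_init(void)
    {
        int err;

        vkbd = input_allocate_device();
        if (!vkbd)
            return -ENOMEM;

        vkbd->name = "virtual-metronome-keyboard";
        set_bit(EV_KEY, vkbd->evbit);   /* device produces key events */
        set_bit(keycode, vkbd->keybit); /* ...for this single key     */

        err = input_register_device(vkbd);
        if (err) {
            input_free_device(vkbd);
            return err;
        }

        hrtimer_init(&timer, CLOCK_MONOTONIC, HRTIMER_MODE_REL);
        timer.function = tick;
        hrtimer_start(&timer, ms_to_ktime(period_ms), HRTIMER_MODE_REL);
        return 0;
    }

    static void __exit vkbd_exit(void)
    {
        hrtimer_cancel(&timer);
        input_unregister_device(vkbd);
    }

    module_init(vkbd_init);
    module_exit(vkbd_exit);
    MODULE_LICENSE("GPL");

Because hrtimers expire with nanosecond resolution on CLOCK_MONOTONIC, a driver of this kind can time-stamp each injected event in the callback and later compare those driver-side time stamps against the ones recorded by the experiment software, isolating the latency contributed by the software itself.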

Keywords

Virtual keyboard · Experimental software · Open source software

Copyright information

© Psychonomic Society, Inc. 2017

Authors and Affiliations

  1. Faculty of Engineering, University of Deusto, Bilbao, Spain
  2. Universidad Autónoma de Madrid, Madrid, Spain
