Researchers are increasingly adopting Web browsers, such as Microsoft Edge, Apple Safari, Google Chrome, and Mozilla Firefox, for presenting stimuli and recording responses in behavioral experiments. One reason for this trend is that online/Web-based experiments allow experimenters to efficiently reach large and diverse samples for psychological research. For example, Richter and Gast (2017) recruited participants via the online platform Prolific Academic (https://www.prolific.co) and collected data from 378 participants aged 18 to 69. Similarly, van Steenbergen, Band, and Hommel (2015) recruited 198 participants aged 19 to 69 via Amazon Mechanical Turk (https://www.mturk.com). Studies of this kind benefit more from online experiments than from traditional offline experiments conducted in a typical psychological laboratory.

Previous research has shown that the accuracy of stimulus presentation and participants’ response timing in an online experiment is sufficient to replicate classical laboratory-based studies. For example, it was reported that although reaction times in a visual search task differed slightly between Web-based and lab-based experiments, the effect of set size and the shapes of the distributions were almost the same (Chetverikov & Upravitelev, 2016; de Leeuw & Motz, 2016). Pauszek, Sztybel, and Gibson (2017) also successfully replicated spatial cuing effects, one of which is that reaction times to a target cued by arrows were faster than those to a target cued by a word like “above” or “left”. Bazilinskyy and de Winter (2018) replicated the effects of stimulus onset asynchrony (SOA) between the onset of a visual stimulus and that of an auditory stimulus on reaction times. In addition, sequences of keystrokes have been recorded with reliable timing accuracy (Pinet et al., 2017). However, it should be noted that some behavioral experiments that required very short display durations could not be replicated (Crump, McDonnell, & Gureckis, 2013; Semmelmann & Weigelt, 2017). More recently, Sasaki and Yamada (2019) compared the threshold of luminance contrast measured online with that measured in a laboratory, and found no significant difference between the thresholds. At present, it is generally agreed that online experiments have great potential in behavioral research (for review, see Stewart, Chandler, & Paolacci, 2017).

Many tools, libraries, and frameworks have been developed for creating Web-based experiments (e.g., Barnhoorn, Haasnoot, Bocanegra, & van Steenbergen, 2015; Reips & Neuhaus, 2002; Schubert, Murteira, Collins, & Lopes, 2013; von Bastian, Locher, & Ruflin, 2013). This article focuses on jsPsych, a JavaScript framework developed by de Leeuw (2015). The jsPsych framework includes a number of plugins for various types of online experiments; simply by setting the plugins’ parameters, experimenters can start their research on the Internet. Participants do not need to install additional plugins or applets on their own computers, and jsPsych does not depend on Flash technologies, for which Adobe has decided to end development and support.

Although jsPsych seems to be a promising JavaScript library for online studies, some points remain to be improved and evaluated when experiments require temporally accurate and spatially flexible stimulus presentation, as in psychophysical research. First, the library does not provide a convenient method for setting a different onset time for each stimulus, and SOAs have not yet been adequately evaluated. Second, visual stimuli are presented automatically at the center of the window, and there is no efficient way to present them at specific coordinates. Lastly, jsPsych does not provide an easy way to present moving objects except by using movie files. These methods may not have been implemented because such studies were considered too complicated to conduct online. However, given the increase in online research and recent improvements in hardware and software performance, online psychophysical experiments can be expected to become common in the coming years (for review, see Woods, Velasco, Levitan, Wan, & Spence, 2015).

The goal of this study is to develop an open-source plugin for jsPsych that enables experimenters to easily present multiple stimuli with specified SOAs, and to evaluate those SOAs using a Black Box ToolKit, a device for accurately measuring presentation and response times. Ideally, the actual SOAs would be identical to the intended times; even if they are not, consistent differences across browser–computer combinations would allow the onset and duration of the stimuli to be calibrated. A large systematic bias, however, would be problematic. For example, if women used the Edge browser more than men, and the stimulus duration in Edge was significantly longer than in other Web browsers, the women would be presented with stimuli for longer than the men. As a result, the experiment might yield a false positive based on the browser-dependent difference rather than a gender difference. It is therefore very important to reduce the variability in stimulus presentation across browser–computer combinations. The present study evaluated how accurately multiple stimuli are presented in online studies using the new jsPsych plugin.

The new jsPsych plugin for Web-based psychophysics studies

This article introduces a new open-source jsPsych plugin called jspsych-psychophysics, which is designed to set different onset times for geometric figures, images, sounds, and moving objects, and to synchronize those onsets with the refresh rate of the display. The latest version of the plugin can be downloaded freely from https://jspsychophysics.hes.kyushu-u.ac.jp under the MIT license. It is compatible with jsPsych v.6.1.0.

The downloaded files include the plugin file “jspsych-psychophysics.js” and demonstration programs. To use the plugin, the plugin file must be included using the <script> tag like other jsPsych plugin files. The following sections explain how to present each type of stimulus.
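For illustration, a minimal set of includes in the experiment's HTML file might look like the following; the file paths here are assumptions that depend on where the downloaded files are placed:

```html
<!-- Paths are assumptions; adjust them to the actual file locations. -->
<script src="jspsych-6.1.0/jspsych.js"></script>
<script src="jspsych-psychophysics.js"></script>
<link rel="stylesheet" href="jspsych-6.1.0/css/jspsych.css">
```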

Presenting rectangles with SOAs in a trial

First, all the stimuli used in the program with the jspsych-psychophysics plugin must be specified as a JavaScript object as follows:

figure a

This code means that a white rectangle is presented at coordinates (250, 360) in a canvas, an HTML element that provides many drawing tools. The origin of the coordinate system is the top left of the canvas, and the unit is the pixel. The width and height of the rectangle are 300 and 200 pixels, respectively. The color can be specified using HTML color names, hexadecimal (HEX) colors, or RGB values, as in a general HTML file. The most notable property in this object is show_start_time, which enables the stimulus to be presented at the intended time; in this example, the white rectangle is presented 1500 ms after the beginning of the trial. Circles, lines, and text can be presented in the same way.
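The object described above can be sketched as follows. The property names (obj_type, startX, startY, show_start_time, and so on) follow the plugin's documentation, but this is a reconstruction rather than a verbatim copy of the paper's figure:

```javascript
// A sketch of the stimulus object described in the text.
var rect_object = {
    obj_type: 'rect',       // a rectangle
    startX: 250,            // horizontal coordinate in the canvas, in pixels
    startY: 360,            // vertical coordinate in the canvas, in pixels
    width: 300,             // width in pixels
    height: 200,            // height in pixels
    line_color: 'white',    // HTML color name, HEX color, or RGB value
    fill_color: 'white',
    show_start_time: 1500   // presented 1500 ms after the trial begins
};
```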

Next, a trial object has to be specified as follows:

figure b

The stimuli property holds all of the objects to be presented in the trial. In this example, three rectangles are presented in a canvas whose width and height are 1000 and 800 pixels, respectively, and whose background color is green (HEX: #008000). This trial object must be included in the timeline property of jsPsych.init, a core function of jsPsych.
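A sketch of such a trial object follows. The three rectangle objects are hypothetical placeholders (their coordinates and onsets are invented for illustration), and the type name is taken from the plugin's documentation:

```javascript
// Hypothetical stimulus objects standing in for the three rectangles.
var rect1 = { obj_type: 'rect', startX: 250, startY: 360, width: 300, height: 200, fill_color: 'white', show_start_time: 0 };
var rect2 = { obj_type: 'rect', startX: 500, startY: 200, width: 300, height: 200, fill_color: 'white', show_start_time: 500 };
var rect3 = { obj_type: 'rect', startX: 750, startY: 520, width: 300, height: 200, fill_color: 'white', show_start_time: 1500 };

// The trial object: three rectangles on a 1000 x 800 green canvas.
var trial = {
    type: 'psychophysics',          // the plugin's type name
    stimuli: [rect1, rect2, rect3], // all objects presented in this trial
    canvas_width: 1000,
    canvas_height: 800,
    background_color: '#008000'     // green
};
```

The trial object is then placed in the timeline, e.g., jsPsych.init({ timeline: [trial] }).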

Presenting image and audio files

The procedure for presenting image and audio files is almost identical to that for presenting rectangles, except that these files are preloaded at the beginning of the experiment. Preloading is essential for presenting the files accurately at the intended times. Code snippet 1 shows how to preload image and audio files and how to present them using the jspsych-psychophysics plugin. The snippet presents happy_face_1.jpg, scaled to 80% of its original size, at the center of the canvas 500 ms after the trial begins; after another 500 ms, tone_1.mp3 is played.

figure c

Code snippet 1. JavaScript code to preload image and audio files, and to present them using the jspsych-psychophysics plugin.
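A sketch of Code snippet 1 follows, under the assumptions that the files sit in img/ and sound/ folders and that the property names match the plugin's documentation; it also assumes that a stimulus whose startX/startY are omitted defaults to the center of the canvas:

```javascript
// Image presented 500 ms after the trial begins, scaled to 80%.
var image_object = {
    obj_type: 'image',
    file: 'img/happy_face_1.jpg',  // assumed path
    scale: 0.8,                    // reduce the size to 80%
    show_start_time: 500           // position assumed to default to the canvas center
};

// Sound played another 500 ms later.
var sound_object = {
    obj_type: 'sound',
    file: 'sound/tone_1.mp3',      // assumed path
    show_start_time: 1000
};

var trial = {
    type: 'psychophysics',
    stimuli: [image_object, sound_object]
};
```

In jsPsych v.6.1.0, the files are preloaded by passing them to jsPsych.init, e.g., jsPsych.init({ timeline: [trial], preload_images: ['img/happy_face_1.jpg'], preload_audio: ['sound/tone_1.mp3'] }).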

Motion stimuli

The jspsych-psychophysics plugin can change the positions of visual stimuli in sync with the refresh rate of the display using the requestAnimationFrame method. Garaizar and Reips (2019) evaluated the accuracy of this method and showed no frame loss across almost all combinations of browsers (Chrome and Firefox) and operating systems (Windows and Linux). Moreover, Pronk, Wiers, Molenkamp, and Murre (2019) reported that presentation times using the requestAnimationFrame method were more accurate than those using CSS animations. To move objects with the jspsych-psychophysics plugin, the horiz_pix_sec or horiz_pix_frame property must be specified; these properties represent how far, in pixels, the object moves horizontally per second or per frame. Similarly, the vert_pix_sec and vert_pix_frame properties move objects vertically. Although these properties appear to work properly, evaluating their presentation time accuracy is beyond the scope of this paper. The main properties that can be used in the jspsych-psychophysics plugin are summarized in Table 1.

Table 1 List of main properties of jspsych-psychophysics
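For example, a circle drifting rightward via the horiz_pix_sec property might be specified as follows; the property names follow the plugin's documentation, while the concrete values are illustrative:

```javascript
// An illustrative moving stimulus: a circle drifting rightward
// at 100 pixels per second from an initial position of (100, 400).
var moving_circle = {
    obj_type: 'circle',
    startX: 100,
    startY: 400,
    radius: 30,
    fill_color: 'white',
    show_start_time: 0,
    horiz_pix_sec: 100   // horizontal motion, in pixels per second
};
```

Presumably, a negative value moves the object in the opposite direction, and vert_pix_sec / vert_pix_frame work analogously for vertical motion.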

Study 1: Evaluation of the SOAs between visual stimuli

In the jspsych-psychophysics program, visual stimuli are presented in sync with the refresh rate of the display using the requestAnimationFrame method. It was therefore predicted that the presentation of visual stimuli would be temporally more accurate with the plugin than without it. This study compared the accuracy with and without the plugin using the Black Box ToolKit Version 2, a device that allows psychologists, neuroscientists, and vision researchers to confirm the validity of their experimental equipment (https://www.blackboxtoolkit.com). The Black Box detects the onset and offset of visual and auditory stimuli using photo-sensors and microphones. Using this specialist hardware, Reimers and Stewart (2015) evaluated the display durations of a white square presented by programs written in Adobe Flash and JavaScript (HTML5). They showed that the durations were about 25 ms longer than the intended times, while within-machine variability was relatively small (generally less than 10 ms). The present study investigated whether the program with the jspsych-psychophysics plugin showed the same (or better) accuracy as that reported by Reimers and Stewart. In addition, this study focused not only on the display duration but also on the SOAs between visual stimuli, which have not been adequately evaluated.

Method

Hardware and software

Measurements were conducted using two Windows computers and two Macintosh computers, whose specifications are shown in Table 2. No external monitor was attached to any of the computers.

Table 2 List of machines tested

Google Chrome (75.0.3770) and Mozilla Firefox (68.0) were used on all computers. Microsoft Edge (42.17134) was used on the HP ProBook and Dell OptiPlex. Two versions of Safari were used: v.11.1.2 on the MacBook Pro and v.12.1.1 on the iMac. Note that 32-bit Web browsers were used only on the Dell OptiPlex; 64-bit Web browsers were used on the other computers.

Procedures

One program was coded using the jspsych-psychophysics plugin, and the other was coded without it. In both programs, two white squares (100 × 100 pixels) were presented asynchronously on a black screen at different coordinates; one square appeared earlier and remained longer than the other. When the first square was presented for 150 ms, the other square was presented for 20 ms at SOAs of 20, 50, 150, and 500 ms. Similarly, when the first square was presented for 500 ms, the other square was presented for 50 ms at the same SOAs. These display durations were identical to those in Reimers and Stewart's (2015) study, except for the 20-ms duration condition, which was adopted to test how briefly a stimulus could be presented in a Web-based experiment. Note that a typical 60-Hz display cannot present a stimulus for exactly 20 ms, because one frame lasts 16.7 ms; when 20 ms was specified as a parameter, the stimulus was expected to be presented for one or two frames. The purpose of this study was to compare the program using the new plugin with the program without it when the same timing parameters were given. Throughout this article, “intended” time refers to the time specified as a parameter in the programs.
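The frame arithmetic above can be made concrete with a small sketch. The assumption that the realized duration falls on either the floor or the ceiling of the frame count is a simplification of how browsers actually schedule frames:

```javascript
// Duration of one frame at a given refresh rate, in milliseconds.
function frameDuration(refreshRateHz) {
    return 1000 / refreshRateHz;
}

// Possible displayed durations for an intended duration, assuming the
// stimulus stays up for either floor or ceil of the frame count
// (which of the two occurs depends on scheduling).
function possibleDurations(intendedMs, refreshRateHz) {
    var frame = frameDuration(refreshRateHz);
    var lower = Math.max(1, Math.floor(intendedMs / frame));
    var upper = Math.ceil(intendedMs / frame);
    return [lower * frame, upper * frame];
}

// An intended 20-ms stimulus on a 60-Hz display occupies one or two
// frames, i.e., roughly 16.7 ms or 33.3 ms.
var range = possibleDurations(20, 60);
```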

The stimuli were presented 100 times in each of the eight conditions, except on the Dell OptiPlex, because the test program using the jspsych-psychophysics plugin hung in Google Chrome on this machine. Although the reason was not certain, it was likely related to the browser version. Five additional 32-bit computers were also tested, which revealed that the test programs using 32-bit Chrome hung after some trials (mean = 637 trials, standard deviation [SD] = 37.7). Because of this problem, the number of repetitions of the test program with the new plugin on 32-bit Chrome on the OptiPlex was reduced to 50 (Footnote 1).

All the programs were located on a Web server (see https://jspsychophysics.hes.kyushu-u.ac.jp), and an Internet connection was needed to make them run. Other applications were not run during the measurements.

Results and discussion

There were some cases in which the display duration of a white square to be presented for 20 ms could not be captured properly by the photo-sensors attached to the Black Box. Table 3 shows the number of missing measurements during 100 trials, except for the jspsych-psychophysics plugin condition with 32-bit Chrome on the OptiPlex (50 trials). This suggests that in rare cases the square could not be presented for the very short display duration; that is, a display frame was dropped. There was no systematic bias in the missing measurements across SOAs, browser types, plugins, or computers, and there were no missing measurements in the other duration conditions. In addition, in the no-plugin condition with Firefox on the OptiPlex, one trial yielded an exceedingly long display duration of 2089.5 ms, whereas the intended duration was 20 ms. The mean deviation from the intended display duration was 23.2 ms (SD = 103.2) including this trial and 18.0 ms (SD = 11.7) excluding it. Such extreme values were not observed in the other conditions.

Table 3 Number of missing measurements in the 20-ms display duration condition

The measured display durations were more accurate and stable in the plugin conditions than in the no-plugin conditions. Table 4 shows the means and SDs of deviations from the intended display durations of the square calculated over different SOAs. The notable results are that (i) the large mean deviations (greater than 10 ms) in the no-plugin conditions were reduced in the plugin conditions, and (ii) the SDs in the plugin conditions were smaller than or almost the same as those in the no-plugin conditions. These results show that the jspsych-psychophysics plugin improved the accuracy and precision of the display duration of visual stimuli. This improvement can be seen more clearly in the histogram of the actual display duration in each condition. The histograms in the no-plugin condition show multimodal distributions, with an interval of one frame. On the other hand, the histograms in the plugin condition show unimodal distributions, especially in Windows. All histograms related to Table 4 are available at Open Science Framework (https://osf.io/pj4sb/wiki/home/). In addition, in the no-plugin condition with Windows, Chrome presented the stimuli more accurately than Firefox, as also reported by Pronk et al. (2019). However, the marked difference among the browsers, including Edge and Safari, was not observed in the plugin condition. The small variability across browsers is ideal for a tool for online studies because it means participants can select any type of browser. It should be noted that the mean deviations in almost all the plugin conditions were negative, and this result is not consistent with that of the no-plugin conditions or with Reimers and Stewart's (2015) study. 
This inconsistency seems to be attributable to the timing methods used: previous research used jQuery’s .show, .delay, and .hide methods; the no-plugin conditions in this study used the setTimeout method; and the jspsych-psychophysics plugin conditions used the requestAnimationFrame method, which synchronizes stimulus presentation with the refresh rate of the display.

Table 4 Means and SDs of deviations from intended display durations of the square

The measured SOAs were more accurate and stable in the plugin conditions than in the no-plugin conditions. Tables 5 and 6 show the mean deviations from the intended SOAs of the two squares for the two conditions, respectively: (a) one square was presented for 150 ms before the other was presented for 20 ms, and (b) one square was presented for 500 ms before the other was presented for 50 ms. It is clearly shown in both Tables 5 and 6 that the large deviations (greater than 10 ms) in the no-plugin conditions were reduced in the plugin conditions in almost all cases. However, it is noteworthy that even in the plugin conditions, the mean deviations were relatively large for the 20-ms SOA on all the browsers in the MacBook Pro and iMac, suggesting that the SOA of 20 ms was too short to adopt in a Web-based psychophysical experiment. This result might be related to the failure to replicate the priming effect (Crump et al., 2013; Semmelmann & Weigelt, 2017). Except for the 20-ms condition, the jspsych-psychophysics plugin worked properly, and improved the performance of the jsPsych program. All histograms related to Tables 5 and 6 are available at Open Science Framework (https://osf.io/pj4sb/wiki/home/).

Table 5 Means and SDs of deviations from intended SOAs between the onset of the 150-ms square and the 20-ms square
Table 6 Means and SDs of deviations from intended SOAs between the onset of the 500-ms square and the 50-ms square

Study 2: Evaluation of the SOAs between visual and auditory stimuli

The jspsych-psychophysics plugin presents audio stimuli using the same method that jsPsych (de Leeuw, 2015) itself provides, namely the Web Audio API, a multifunctional library for playing audio on the Web. Reimers and Stewart (2016) reported that although auditory durations were relatively accurate, the SOAs between visual and auditory stimuli varied substantially across browser–computer combinations (see Slote & Strand, 2016, for Web-based cognitive psychological research with auditory stimuli). The present study investigated whether the program with the jspsych-psychophysics plugin showed the same (or better) accuracy as that reported by Reimers and Stewart.

Method

Hardware and software

Measurements were conducted using computers identical to those used in Study 1, whose specifications are shown in Table 2. An external speaker was attached to the MacBook Pro because the volume from the internal speaker was too low to measure. The versions of the Web browser were also identical to those used in Study 1.

Procedures

One program was coded using the jspsych-psychophysics plugin, and the other program was coded without the plugin. In the jspsych-psychophysics plugin condition, a white square (100 × 100 pixels) was presented for 500 ms on a black screen, ahead of or at the same time as a sine-wave sound of 440 Hz (100-ms duration). The SOAs between the onset of the square and that of the sound were 0, 20, 50, 150, and 500 ms. In each of the five conditions, the stimuli were presented 100 times.

In the no-plugin condition, a white square was presented for 500 ms on a black screen, followed by the 440-Hz sine-wave sound for 100 ms. Because there was no simple way to present an auditory stimulus during the presentation of a visual stimulus using only the core methods of jsPsych, the sound was presented immediately after the square disappeared; in other words, there was only a 500-ms SOA condition. However, a reviewer suggested a sophisticated way to present an image and a sound simultaneously, and the resulting stimulus presentation method is identical to that adopted by Bridges, Pitiot, MacAskill, and Peirce (2020). Using this method, the 0-ms SOA data in the no-plugin condition were recorded post hoc on the ProBook, OptiPlex, and iMac. Note also that the Web Audio API was turned off in the original (500-ms SOA) condition and turned on in the 0-ms SOA condition. Both the visual and audio stimuli were presented 100 times.

All the programs were located on a Web server (see https://jspsychophysics.hes.kyushu-u.ac.jp), and an Internet connection was needed to make them run. Other applications were not run during the measurements.

Results and discussion

There were some cases in which the sound could not be properly captured by the microphones attached to the Black Box. For the no-plugin condition with Edge in the ProBook, the program could not play back the sound stably. The reason was unknown, and the problem did not depend on the use of the internal or external speakers. Note that the jspsych-psychophysics program could play back the sound under the same settings. In addition, for the no-plugin condition with Firefox in the MacBook Pro, the sound was recorded as divided into three short parts. These data were eliminated, leaving a total of 97 trials in this condition.

The measured sound durations were more accurate and stable in the plugin conditions than in the no-plugin conditions. Table 7 shows the means and SDs of deviations from the intended display and sound durations. Because only the 500-ms SOA was used for the no-plugin condition, the means and SDs for the plugin condition were also calculated using the data for the 500-ms SOA. For both the no-plugin and plugin conditions, the visual and auditory stimuli were generally presented accurately and precisely. More importantly, the large means and SDs of the sound duration in the no-plugin conditions with Safari in the MacBook Pro and iMac were reduced in the plugin conditions. This result indicates that the jspsych-psychophysics plugin reduces variability between browsers and computers.

Table 7 Means and SDs of deviations from intended display and sound durations

The measured SOAs between the onset of the square and that of the sound were inaccurate in both the plugin and no-plugin conditions. As shown in Table 8, the mean deviations from the intended SOAs, including the data recorded post hoc, were substantial (from about 5 ms to 50 ms) and varied across browser–computer combinations. This result is consistent with the reports by Reimers and Stewart (2016) and Bridges et al. (2020). However, it does not mean that visual–auditory interaction cannot be investigated in online studies; there are several alternatives. First, a movie file can be used to present visual and audio stimuli together, as Reimers and Stewart suggested. Second, given that Bazilinskyy and de Winter (2018) successfully replicated the effect of SOA on reaction times, a within-participant experimental design remains viable. Lastly, although it takes additional time, the difference between the onset of a visual stimulus and that of an auditory stimulus can be determined by adopting a staircase procedure (Woods et al., 2015). More importantly, although the new plugin could not present the visual and audio stimuli with accurate SOAs, the mean deviations were lower than those in the no-plugin conditions, especially for Chrome in Windows and Safari in Mac. In that sense, the jspsych-psychophysics plugin improved the performance of the jsPsych program. All histograms related to Tables 7 and 8 are available at Open Science Framework (https://osf.io/pj4sb/wiki/home/).

Table 8 Means and SDs of deviations from intended SOAs between square and sound

Conclusion

This study introduced a new jsPsych plugin for conducting Web-based psychophysical experiments, and evaluated both stimulus durations and SOAs, the latter of which had not been adequately evaluated previously. The new plugin reduces variability in both duration and SOA across browser–computer combinations, although the presentation of sound files remains somewhat limited. Moreover, after the submission of this paper, the plugin was updated so that timing information can be specified not only in milliseconds but also in frames. Use of the plugin is recommended for any kind of experiment using jsPsych, because visual stimuli are presented in sync with the refresh rate of the display; in other words, visual stimuli will be presented more accurately than when the plugin is not used.