
Experimental Brain Research, Volume 236, Issue 2, pp 347–354

Rapid temporal recalibration to visuo–tactile stimuli

  • Joachim Lange
  • Katharina Kapala
  • Holger Krause
  • Thomas J. Baumgarten
  • Alfons Schnitzler
Open Access
Research Article

Abstract

For a comprehensive understanding of the environment, the brain must constantly decide whether the incoming information originates from the same source and needs to be integrated into a coherent percept. This integration process is believed to be mediated by temporal integration windows. If presented with temporally asynchronous stimuli for a few minutes, the brain adapts to this new temporal relation by recalibrating the temporal integration windows. Such recalibration can occur even more rapidly after exposure to just a single trial of asynchronous stimulation. While rapid recalibration has been demonstrated for audio–visual stimuli, evidence for rapid recalibration of visuo–tactile stimuli is lacking. Here, we investigated rapid recalibration in the visuo–tactile domain. Subjects received visual and tactile stimuli with different stimulus onset asynchronies (SOA) and were asked to report whether the visuo–tactile stimuli were presented simultaneously. Our results demonstrate visuo–tactile rapid recalibration by revealing that subjects’ simultaneity reports were modulated by the temporal order of stimulation in the preceding trial. This rapid recalibration effect, however, was only significant if the SOA in the preceding trial was smaller than 100 ms, while rapid recalibration could not be demonstrated for SOAs larger than 100 ms. Since rapid recalibration in the audio–visual domain has been demonstrated for SOAs larger than 100 ms, we propose that visuo–tactile recalibration works at shorter SOAs, and thus faster time scales than audio–visual rapid recalibration.

Keywords

Temporal integration · Integration windows · Multisensory · Perceptual cycles · Simultaneity task · Visual

Introduction

Our sensory systems permanently receive multiple bits of information from complex natural events. Usually, this information is multimodal and thus the input is initially processed in different sensory systems. To generate a coherent perception of our environment, the brain must constantly decide whether the input comes from the same source and consequently needs to be integrated into a single perceptual event. Alternatively, if the information originates from multiple, different sources, the sensory input needs to be segregated into multiple separate perceptual events.

One major determinant for integration or segregation is temporal proximity between the multiple sensory inputs. Stimuli that occur in close temporal proximity are more likely to originate from one source and thus are more likely to produce enhanced neural activity (Stein and Wallace 1996; Kayser and Logothetis 2007). Also, on a perceptual level, temporal proximity is an important determinant for integration. For example, when congruent lip movements and auditory speech reach the brain within a short time window of a few milliseconds, they are more likely to originate from the same source than lip movements and auditory speech separated by seconds. Thus, lip movements and speech signals in close temporal proximity are more likely to be perceived as one congruent speech event (van Wassenhove et al. 2007). The brain can also use information about temporal structure or rhythmicity as predictive cues for sensory integration (Luo and Poeppel 2007; Vroomen and Stekelenburg 2010). Such findings led to the hypothesis of temporal integration windows for multisensory perception (Pöppel 1997; van Wassenhove et al. 2007). If multiple stimuli fall within an integration window, they will be integrated and lead to a coherent perception of a single event (van Wassenhove et al. 2007; Cecere et al. 2015; Baumgarten et al. 2015; VanRullen 2016). Otherwise, the two stimuli will be processed and subsequently perceived as two separate events.

In natural events, multisensory stimuli often exhibit stimulus onset asynchronies (SOA), which can depend on several factors such as distance of the source, neural transduction latencies or neural processing times (Alais and Carlile 2005; Harrar and Harris 2005). It has been shown that the brain has developed mechanisms to adapt to such SOAs and to compensate for the variability of temporal information (Harrar and Harris 2005; Thorne et al. 2011). Such adaptation can occur within minutes. For example, in simultaneity judgments, the brain adapts to constant exposure to asynchronous stimuli after a few minutes. This adaptation mechanism shifts the point of subjective simultaneity (PSS) towards the modality that was the leading modality during the adaptation phase. Such ‘recalibration’ of (learned) temporal dependencies has been repeatedly shown for a variety of multisensory stimuli (Fujisaki et al. 2004; Hanson et al. 2008; Harrar and Harris 2008).

Recently, van der Burg et al. (2013) demonstrated that recalibration can occur even more rapidly in the absence of a prolonged adaptation phase. In an audio–visual simultaneity task, the PSS was found to be quickly recalibrated by the modality order in the preceding trial. Such rapid recalibration in only a single trial, however, could only be shown for audio–visual simultaneity judgments, while no rapid recalibration effects were found in visuo–tactile and audio–tactile simultaneity judgments tasks (van der Burg et al. 2015b).

Van der Burg et al. (2015b) concluded that, in contrast to long-term adaptation, rapid recalibration is unique to audio–visual stimuli. However, an alternative explanation for the uniqueness of audio–visual stimuli might be the temporal scale on which multisensory stimuli are processed. In their study, van der Burg et al. used SOAs of ≥ 100 ms. Such SOAs are well in the range of proposed temporal integration windows for visual or audio–visual integration (van Wassenhove et al. 2007; Romei et al. 2012; Cecere et al. 2015). Integration windows for tactile stimuli, however, have been shown to act on shorter time scales of ~ 50 ms (Baumgarten et al. 2015, 2017). Thus, temporal integration windows for other multisensory stimuli involving tactile stimuli might also act on shorter time scales (Harrar and Harris 2005; Gick et al. 2010).

In this study, we investigated rapid recalibration for visuo–tactile stimuli. We hypothesized that rapid recalibration of perception for visuo–tactile stimuli might act on shorter time scales compared to audio–visual stimuli. To this end, we adapted the paradigm of van der Burg et al. (2015b), but additionally used visuo–tactile stimuli with shorter SOAs. Our results demonstrate that rapid recalibration also occurs for visuo–tactile stimuli. The recalibration, however, was only found for short SOAs in the preceding, recalibrating trial.

Methods

Participants

Twenty-two subjects [six male, age: 24.6 ± 3.0 years (mean ± SD)] participated in this study after giving written informed consent in accordance with the Declaration of Helsinki and with approval of the Ethical Committee of the Medical Faculty, Heinrich Heine University Düsseldorf. All participants had normal or corrected-to-normal vision and reported no somatosensory deficits or known history of neurological disorders.

Four subjects had to be excluded from analysis due to implausible response patterns during the task (see below), leaving 18 subjects [six male, age 24.4 ± 3.3 years (mean ± SD)] for the analyses.

Stimuli and paradigm

Subjects sat comfortably in a sound-attenuated room with dimmed light, with their head placed in the helmet-shaped inlay of a magnetoencephalograph (MEG). While we recorded neuronal activity with the MEG during the task, in this study we focus solely on the behavioral parameters of the task.

Subjects fixated a central grey dot presented via a projector (PT-D7700E, Panasonic, Japan) located outside the magnetically shielded room on a screen 57 cm in front of them. After 1000 ms, the dot decreased in luminance, indicating the start of the stimulation period (Fig. 1). After a jittered period of 800–1300 ms, subjects received visuo–tactile stimulation. Electro-tactile stimulation was generated by a Stimulus Current Generator (DeMeTec GmbH, Germany) and applied as electrical pulses (duration 0.3 ms) via ring electrodes attached to the tip of the left index finger. Stimulation amplitudes (2.8 ± 0.9 mA) were individually adjusted prior to the experiment to a level at which subjects could clearly perceive the stimulation, but below the pain threshold. Visual stimulation was applied via a light-emitting diode (LED) attached to the tip of the left index finger, just below the electrodes for electrical stimulation. Stimulation intensity was set such that the light was clearly visible (duration 5 ms) and was kept constant for all subjects. The left hand was placed comfortably on a table so that stimulation appeared at ~ 20° left of and below the fixation dot. In each trial, one visual and one tactile stimulus were applied with varying stimulus onset asynchronies (SOAs) of ± 300, ± 150, ± 125, ± 100, ± 75, ± 50, ± 25 or 0 ms, with negative values indicating that visual stimulation occurred before tactile stimulation and vice versa. The SOAs of ± 300, ± 150 and 0 ms were each presented in 10 trials; the remaining SOAs were each presented in 50 trials, in pseudo-randomized order. Stimulation was followed by another jittered period (600–1300 ms) during which only the fixation dot was visible, before response instructions were presented on the screen. In a forced-choice paradigm, subjects had to report whether they perceived the visuo–tactile stimulation as simultaneous or non-simultaneous by pressing the respective buttons with the index and middle finger of their right hand.
Button configurations were randomized from trial to trial to minimize response preparation. If subjects did not respond within 3000 ms after the response instructions were presented, or responded before the instructions appeared, a warning message was presented on the screen and the trial was repeated at the end. In total, 550 trials were presented, with self-paced breaks after every 150 trials.
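As a sanity check on the trial counts, the SOA schedule can be sketched in a few lines of Python: five rare SOAs (± 300, ± 150 and 0 ms) at 10 trials each plus ten frequent SOAs at 50 trials each give the 550 trials reported. This is an illustrative reconstruction, not the original stimulus code; the function name and seed are our own.

```python
import random

def build_trial_list(seed=0):
    """Illustrative reconstruction of the SOA schedule (values in ms).
    Negative SOA: visual leads tactile; positive: tactile leads visual."""
    rare = [-300, -150, 0, 150, 300]           # presented in 10 trials each
    frequent = [-125, -100, -75, -50, -25,
                25, 50, 75, 100, 125]          # presented in 50 trials each
    trials = rare * 10 + frequent * 50
    random.Random(seed).shuffle(trials)        # pseudo-randomized order
    return trials

print(len(build_trial_list()))  # 550
```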

Fig. 1

Experimental setup. Subjects fixated a central fixation dot. After a jittered period, they received one electro-tactile stimulus on their left index finger and one visual stimulus via an LED attached to the left index finger. Both stimuli were presented in varying order and with varying stimulus onset asynchronies (SOA). After another jittered period, subjects reported via button press with their right hand whether the visual and tactile stimuli were perceived as simultaneous or non-simultaneous. Button configuration was randomized from trial to trial

The task was preceded by a training phase of ~ 5 min to familiarize subjects with the task. Prior to the experiment, general instructions were visually presented on the screen. Subjects received no feedback on their responses and remained naïve towards the aim of the task.

Analysis of behavioral data, fitting procedure and statistical analysis

To analyze whether the subjective perception in one trial was influenced by the stimulation in the preceding trial, we separated trials according to the stimulation in the preceding trial. That is, one condition contained all trials for which the stimulation order of the preceding trial was tactile–visual (i.e., all trials whose preceding trial had a positive SOA, denoted tv), while the other condition contained all trials for which the stimulation order in the preceding trial was visual–tactile (i.e., all trials whose preceding trial had a negative SOA, denoted vt). For the SOAs of ± 300, ± 150 and 0 ms, this resulted in 4.8 ± 1.2 trials (mean ± SD across subjects and SOAs, range 3.4–6.1) for the tv conditions and 5.1 ± 1.2 trials (mean ± SD, range 3.9–6.1) for the vt conditions. For the remaining SOAs, this resulted in 24.5 ± 1.7 trials (mean ± SD, range 21.5–26.6) for the tv conditions and 24.2 ± 1.9 trials (mean ± SD, range 21.7–27.6) for the vt conditions. For each condition, SOA and subject, we computed mean response rates by averaging the simultaneity reports across trials. Finally, we averaged the mean response rates per condition and SOA across subjects.
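The trial split by preceding-trial order can be sketched as follows. This is illustrative Python, not the original Matlab analysis; in particular, we assume trials preceded by an SOA of 0 ms are skipped, since the text does not specify how such trials were assigned.

```python
from collections import defaultdict

def split_by_preceding(soas, responses):
    """Group trials by the stimulation order of the preceding trial.
    soas: SOAs in ms (negative = visual first, positive = tactile first);
    responses: 1 if 'simultaneous' was reported, else 0. Illustrative sketch."""
    grouped = {"vt": defaultdict(list), "tv": defaultdict(list)}
    for i in range(1, len(soas)):
        prev = soas[i - 1]
        if prev == 0:                       # assumption: no order, skip trial
            continue
        cond = "vt" if prev < 0 else "tv"   # sign of preceding SOA sets condition
        grouped[cond][soas[i]].append(responses[i])
    # mean simultaneity rate per condition and SOA
    return {c: {s: sum(r) / len(r) for s, r in d.items()}
            for c, d in grouped.items()}
```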

Four subjects had to be excluded from the analyses because they showed implausible response patterns, e.g., response distributions not showing the expected bell-like shape (indicating excessive guessing or not understanding the task), an unusually high number of simultaneity reports for long SOAs (> 25% simultaneity reports, indicating a strong bias towards reporting “simultaneous”), or an unusually low number of simultaneity reports for short SOAs (< 75% simultaneity reports, indicating a strong bias towards reporting “non-simultaneous”; Fig. S1).

The point of subjective simultaneity (PSS) denotes the SOA, at which subjects most likely report the stimulation as simultaneous. To estimate the PSS, we modelled each subject’s response distribution. Since response distributions showed an asymmetrical pattern (Figs. 2, S1), we used skewed Gaussian-like functions (Yarrow et al. 2011; van der Burg et al. 2015b):

Fig. 2

A Mean proportion of simultaneity reports as a function of SOA. Trials were split depending on the stimulation order in the preceding trial (visual–tactile or tactile–visual). Red dots represent subjects’ responses (mean ± SEM) if the stimulation order in the preceding trial was tactile–visual. Black dots represent responses if the stimulation order in the preceding trial was visual–tactile. Red and black curves show the best-fitting skewed normal distribution fitted to the averaged data. Red and black vertical lines indicate the respective PSS (point of subjective simultaneity; i.e., the maximum of the fitted function). B PSS were determined for each subject and then averaged across subjects. Black stars and red dots represent PSS of individual subjects, the bars represent mean ± SEM across subjects. Black and red bars (and stars and dots, respectively) represent PSS if the stimulation order in the preceding trial was visual–tactile or tactile–visual, respectively

$$\text{SR}(\text{SOA}) = \text{norm}(\text{SOA}) \times e \times \text{cdf}(\text{SOA})$$
with
$$\text{norm}(\text{SOA}) = a + \frac{1}{b} \exp\left(\frac{-(\text{SOA} - c)^2}{d^2}\right)$$
denoting the normally shaped distribution, cdf the cumulative distribution function of the normal function, a, b, c, d, e the parameters to be fitted, and SR the simultaneity reports as a function of the SOA.
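A minimal numeric sketch of this model, with the PSS read out as the maximum of the fitted function, could look as follows. Two caveats: the cdf term is assumed here to be the cumulative distribution of the same Gaussian (mean c, scale d), which is one plausible reading of the formula, and a simple 1 ms grid search stands in for the original fitting procedure.

```python
import math

def simultaneity_model(soa, a, b, c, d, e):
    # norm(SOA) = a + (1/b) * exp(-(SOA - c)^2 / d^2)
    norm = a + (1.0 / b) * math.exp(-((soa - c) ** 2) / d ** 2)
    # assumption: cdf of the same Gaussian, 0.5 * (1 + erf((SOA - c) / d))
    cdf = 0.5 * (1.0 + math.erf((soa - c) / d))
    return norm * e * cdf                    # SR(SOA) = norm * e * cdf

def estimate_pss(params, lo=-300, hi=300):
    # PSS = SOA (ms) at which the fitted function peaks; 1 ms grid search
    return max(range(lo, hi + 1), key=lambda s: simultaneity_model(s, *params))
```

With a symmetric Gaussian (c = 0) the cdf factor skews the curve, so the peak, i.e., the PSS, is pulled away from c — which is exactly why a skewed rather than plain Gaussian is needed for these asymmetric response distributions.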

We estimated the PSS for trials in the vt and tv conditions (PSSvt and PSStv) for each subject separately. In addition, we estimated the PSS including all trials, i.e., before considering putative rapid recalibration effects. Finally, we statistically compared PSSvt and PSStv across subjects by means of a two-tailed paired-samples t test (after confirming with a Kolmogorov–Smirnov test that the distribution of individual PSS did not significantly differ from a normal distribution).
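The across-subject comparison is a standard paired-samples t test; a minimal stdlib sketch of the test statistic is shown below (the p-value lookup against the t distribution is omitted).

```python
import math
import statistics

def paired_t(x, y):
    """Two-tailed paired-samples t statistic for matched observations,
    e.g. each subject's PSSvt vs. PSStv. Sketch, not the original code."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    se = statistics.stdev(diffs) / math.sqrt(n)   # SE of the mean difference
    return statistics.mean(diffs) / se, n - 1     # (t statistic, df)
```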

PSSvt and PSStv might be influenced not only by the stimulation order in the preceding trial (i.e., visual–tactile vs. tactile–visual), but also by the length of the SOA in the preceding trial. Therefore, we additionally split trials into conditions with the preceding trial having tactile–visual stimulation and an SOA ≥ 100 ms (denoted tv_long) and conditions with the preceding trial having tactile–visual stimulation and an SOA < 100 ms (denoted tv_short). The same split was performed when preceding trials had visual–tactile stimulation (vt_long and vt_short). After confirming with a Kolmogorov–Smirnov test that the distribution of individual PSS did not significantly differ from a normal distribution, the four resulting PSS were statistically compared by a 2 × 2 repeated-measures ANOVA with factors stimulation order (tv or vt) and SOA duration (short or long), followed by post-hoc paired-sample t tests.

Finally, we computed effect sizes (Cohen’s d) according to the formula
$$d = \frac{M_1 - M_2}{\sqrt{\frac{S_1^2 + S_2^2}{2}}}$$
with M1 and M2 denoting the means and S1 and S2 denoting the standard deviations of the two compared conditions.
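In code, this formula is a one-liner (a sketch using sample standard deviations; the text does not state whether sample or population SDs were used).

```python
import math
import statistics

def cohens_d(x, y):
    # Cohen's d with the pooled denominator sqrt((S1^2 + S2^2) / 2)
    m1, m2 = statistics.mean(x), statistics.mean(y)
    s1, s2 = statistics.stdev(x), statistics.stdev(y)
    return (m1 - m2) / math.sqrt((s1 ** 2 + s2 ** 2) / 2)
```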

All analyses were performed using Matlab (The Mathworks Inc., Natick/MA, USA).

Results

Behavioral results

Subjects received one visual and one tactile stimulus in varying order and with varying stimulus onset asynchronies (SOA) and had to report whether they perceived the two stimuli as simultaneous or non-simultaneous. We investigated whether the subjective perception of simultaneity depended on the stimulation pattern in the preceding trial, i.e., whether subjects show rapid recalibration in a visuo–tactile task. To this end, we separated trials into conditions depending on the stimulation order in the preceding trial (visual leading tactile, denoted vt, or tactile leading visual, denoted tv).

For both conditions, subjects showed the lowest number of simultaneity reports for the largest SOAs (± 300 ms) and a peak of simultaneity reports at the point of subjective simultaneity (PSS, Fig. 2A). To determine the PSS, we fitted a skewed Gaussian function to each distribution. PSSvt denotes the PSS for trials in which the stimulation order in the preceding trial was visual–tactile, and PSStv the PSS for trials in which the preceding stimulation order was tactile–visual. In addition, we determined the PSS for the base simultaneity reports, i.e., including all trials irrespective of the stimulation order in the preceding trial.

We found that the modality order of the stimulation in the preceding trial influenced subjects’ perception of simultaneity in the current trial. When we fitted the mean simultaneity responses averaged across subjects, the PSSvt was − 29.8 ms and the PSStv 4.3 ms (Fig. 2A). When we fitted the mean simultaneity reports including all trials irrespective of stimulation order in the preceding trial, the PSS was 3.9 ms (Fig. S1A).

Additionally, we determined individual PSS by fitting each subject’s response distributions. Averaged across subjects, PSSvt was − 25.1 ± 5.5 ms (mean ± SEM) and PSStv was 10.5 ± 5.1 ms (Fig. 2B). Statistical analysis revealed a highly significant difference between conditions PSSvt and PSStv [t(17) = 4.971; p < .001]. The effect size was d = 1.584.

Next, we investigated whether the PSS additionally depended on the length of the SOA in the preceding trial. Therefore, we additionally split the conditions into “long” (absolute SOA ≥ 100 ms) and “short” SOAs (absolute SOA < 100 ms). When we fitted the mean simultaneity responses averaged across subjects, the PSSvt_long was 12.8 ms and the PSSvt_short was − 25.7 ms. The PSStv_long was − 0.8 ms and the PSStv_short was 39.5 ms.

Additionally, we determined individual PSS by fitting each subject’s response distributions. Averaged across subjects, PSSvt_long was − 6.4 ± 7.9 ms (mean ± SEM) and PSSvt_short was − 26.4 ± 7.2 ms. The averaged PSStv_long was − 5.8 ± 10.1 ms (mean ± SEM) and the averaged PSStv_short was 32.5 ± 4.6 ms (Fig. 3).

Fig. 3

Same as Fig. 2B, but now trials were additionally split into conditions depending on the SOA in the preceding trial (long SOA indicating length of SOAs ≥ 100 ms; short SOA indicating length of SOAs < 100 ms)

A 2 × 2 ANOVA with factors modality order in the preceding trial (vt or tv) and SOA duration (long or short) revealed a highly significant main effect for order [F(1,53) = 28.280; p < .001] and a highly significant interaction effect [F(1,53) = 19.546; p < .001], while the main effect of duration was not significant [F(1,53) = 1.064; p = .317].

Post-hoc t tests revealed highly significant differences between the PSStv_short and PSStv_long [t(17) = 4.353; p < .001; d = 1.152] and between PSStv_short and PSSvt_short [t(17) = 8.104; p < .001; d = 2.292]. The other comparisons did not reach statistical significance (p ≥ .136; d ≤ 0.626).

Discussion

After an adaptation phase of sustained repetitive stimulation with asynchronous multisensory stimuli, the brain adapts by reducing the perceived stimulus onset asynchrony (SOA) between the two asynchronous stimuli. This so-called temporal recalibration has been found for audio–visual, audio–tactile, and visuo–tactile stimuli (Fujisaki et al. 2004; Hanson et al. 2008). Recently, studies have shown that temporal recalibration can be induced even by a single trial, so-called rapid recalibration (van der Burg et al. 2013, 2015b). Unlike recalibration following sustained adaptation, rapid recalibration had been reported only for audio–visual stimuli, while it was absent for visuo–tactile stimuli (van der Burg et al. 2015b; Alais et al. 2017). In the present study, however, we were able to demonstrate rapid recalibration for visuo–tactile stimuli. A detailed analysis revealed that significant rapid recalibration occurred most prominently if the SOA in the preceding, recalibrating trial was smaller than 100 ms. In contrast, we could not find visuo–tactile rapid recalibration effects for SOAs ≥ 100 ms.

We found that visuo–tactile and tactile–visual stimulation had different effects on perception in the subsequent trials. When we included all trials in the analyses, i.e., before considering putative rapid recalibration effects, the point of subjective simultaneity (PSS) was 3.9 ms. While visuo–tactile stimulation in the preceding trial shifted the PSS towards negative SOAs (− 29.8 ms), tactile–visual stimulation in the preceding trial had virtually no effect on the PSS (4.3 ms). Only when we restricted the analysis to tactile–visual stimulation with short SOAs (< 100 ms) in the preceding trial was the PSS shifted towards positive SOAs (39.5 ms). The different effects of visual–tactile and tactile–visual stimulation on simultaneity reports might be due to different mechanisms underlying visuo–tactile and tactile–visual integration. That is, the order of the stimulation might have an influence on integration processes. For example, a recent study reported that audio–visual temporal integration differed for auditory-leading vs. visual-leading stimulus pairs (Cecere et al. 2016). The reason for this difference might be that audio–visual and visuo-auditory integration engage different neuronal mechanisms, possibly due to different multisensory sampling mechanisms depending on the leading sense (Cecere et al. 2017). Similar differences for visuo–tactile and tactile–visual stimulation might be responsible for the differences we found in the present study.

Previous studies reported rapid recalibration effects for audio–visual stimuli, but were unable to demonstrate rapid recalibration for visuo–tactile stimuli (van der Burg et al. 2015b; Alais et al. 2017). A major difference between our study and previous studies is the length of the SOAs between visual and tactile stimuli. Typically, previous studies used SOAs with a minimum length of 90–100 ms (Hanson et al. 2008; Harrar and Harris 2008; van der Burg et al. 2013, 2015b; Alais et al. 2017). Such a long SOA was sufficient to induce recalibration effects after sustained adaptation phases for audio–visual, audio–tactile and visuo–tactile stimuli. In addition, SOAs ≥ 100 ms have been shown to induce rapid recalibration for audio–visual stimuli but not for visuo–tactile stimuli (van der Burg et al. 2015b; Alais et al. 2017). Our results confirm these previous results. When we analyzed only those trials for which the preceding trial had an SOA of minimum 100 ms, we were unable to find significant rapid recalibration effects for our visuo–tactile stimuli. With smaller SOAs (< 100 ms), however, we could demonstrate rapid recalibration effects. We thus propose that rapid recalibration of visuo–tactile stimuli works on a shorter time scale (i.e., < 100 ms) than recalibration after a sustained adaptation phase (with SOAs ≥ 100 ms). In addition, we propose that rapid recalibration of visuo–tactile stimuli works on a different time scale (i.e., < 100 ms) than rapid recalibration of audio–visual stimuli (≥ 100 ms).

The length of the SOA is the major difference between our study and previous studies. Yet, there are a couple of additional differences that might affect rapid recalibration effects. While previous studies mostly used tactile stimulators that applied pressure to the finger, we used electrical stimulation. These two stimulation techniques might stimulate different receptors of the skin and thus might contribute differently to temporal tactile perception (Kandel et al. 2000). In addition, we used shorter stimulus durations than previous studies (~ 50 ms in previous studies, ≤ 5 ms in our study), which might have influenced simultaneity judgements (Boenke et al. 2009; Stevenson and Wallace 2013). Finally, we presented visual and tactile stimuli spatially aligned in the periphery, while others presented stimuli at different locations or centrally. While spatial congruency has been shown to be irrelevant for rapid recalibration in the audio–visual domain and for visuo–tactile recalibration after sustained adaptation (Keetels and Vroomen 2007; Ho et al. 2015), it cannot be excluded that spatial congruency is necessary for rapid recalibration. Yet, despite these differences between studies, we were unable to show rapid recalibration effects for SOAs larger than 100 ms. Therefore, we believe that a potential influence of any of these factors on rapid recalibration of visuo–tactile stimuli should be either negligible or only present for SOAs smaller than 100 ms. In conclusion, we propose that rapid recalibration depends predominantly on the SOA of the preceding trial. We could show that the critical SOA for rapid recalibration of visuo–tactile stimuli is < 100 ms, while the critical SOA is 100 ms (and potentially longer) for audio–visual stimuli (van der Burg et al. 2013, 2015b).

This new finding of rapid visuo–tactile recalibration raises two questions. First, why does rapid recalibration of audio–visual and visuo–tactile stimuli occur on different time scales? Second, previous studies have shown that sustained adaptation phases with SOAs of 100 ms can induce recalibration for both audio–visual and visuo–tactile stimuli; why, then, do rapid and sustained recalibration of visuo–tactile stimuli occur on different time scales?

For audio–visual stimuli, sustained adaptation produces a sustained recalibration effect that lasts up to a few minutes. Rapid recalibration, however, produces transient effects that change from trial to trial (van der Burg et al. 2015a). Both effects have been found to occur in parallel in one experiment, but independently of each other. Thus, it has been suggested that both effects occur on separate and independent time scales. Yarrow et al. (2011) suggested that rapid and sustained recalibration reflect different levels of a sensory-decisional process. Sustained recalibration may be caused by criterion shifts in a higher-level decisional process and thus might be considered a supramodal effect that should affect audio–visual, visuo–tactile, and audio–tactile stimuli in a similar way (Yarrow et al. 2011). Indeed, it has been shown that audio–visual, visuo–tactile, and audio–tactile stimuli show sustained recalibration effects of comparable size (Fujisaki et al. 2004; Keetels and Vroomen 2007; Hanson et al. 2008). In contrast, rapid recalibration has been suggested to reflect shifts in temporal alignment in sensory processes (Kösem and van Wassenhove 2012).

A potential reason for the differences between audio–visual and visuo–tactile rapid recalibration might thus be found in the different temporal processing and integration of these stimuli. For example, temporal integration windows for visuo–tactile stimuli have been found to be smaller than for audio–visual stimuli (Harrar and Harris 2008; Fujisaki and Nishida 2009; Noel et al. 2015). It has been argued that audio–visual integration needs flexible integration mechanisms. For example, integration of lip movements and voices typically improves speech processing and perception. It would be beneficial if such integration mechanisms rapidly adapt to inter-individual differences, intra-individual changes of speed or rhythmicity, or changes of the distance between source and observer. In contrast to auditory signals, the distance between the source of touch and the observer cannot change significantly as touch is always applied to the skin. The major source of variability in visuo–tactile integration comes from different neural transduction times for touch at different body parts. Typically, the differences of transduction times are less than 100 ms (Harrar and Harris 2008). From this point of view, one might expect that visuo–tactile integration time windows should rapidly and flexibly adapt at time scales below 100 ms. Visuo–tactile recalibration might thus work on a different, shorter timescale than audio–visual rapid recalibration.

Due to these inter- and intra-individual factors, audio–visual integration needs to adapt to higher variability than visuo–tactile integration. We might speculate that, due to this higher variability, audio–visual stimuli need longer integration windows, which might rely more strongly on the temporal resolution of the visual modality. Visuo–tactile stimuli, on the other hand, show less variation because the tactile source varies less (e.g., each location on the body has a fixed neural transduction latency). Therefore, visuo–tactile stimuli need shorter integration windows, which additionally might rely more strongly on the temporal processing of tactile stimuli. The contribution of visual and tactile information to multisensory processing might vary and depend on the reliability of the sources (Ernst and Banks 2002; Ernst and Bülthoff 2004).

Recent studies revealed that cycles of neuronal oscillations form the neuronal basis of these integration windows (Cecere et al. 2015; Baumgarten et al. 2015, 2017; VanRullen 2016). Different frequency bands seem to play a different role for temporal perception in different modalities, with the theta- (~ 4–7 Hz) and alpha- (~ 8–12 Hz) band playing a dominant role in the visual and audio–visual domain (Romei et al. 2012; Landau and Fries 2012; Cecere et al. 2015; VanRullen 2016), while the beta-band (~ 13–25 Hz) plays a dominant role in the somatosensory domain (Baumgarten et al. 2015, 2017). We might speculate that repetitive or even single stimulation triggers these frequencies relevant for temporal perception. Adaptation to the preceding trial(s) might be reflected by adaptation of the carrier frequency to slightly lower or higher frequencies, respectively. In this framework, adaptation would take place most prominently if the adaption stimulus is able to trigger the frequency relevant for temporal perception. Since the frequencies for temporal perception differ between visual and tactile modalities, we would expect different SOAs to trigger adaptation effects. Since in the (audio-)visual modality alpha frequencies with cycle lengths of ~ 100 ms seem to be relevant, we would expect maximal adaptation and recalibration effects in the visual modality with SOAs of ≥ 100 ms as found in van der Burg et al. (2013, 2015b). In contrast, we would expect maximal adaptation and recalibration effects in the somatosensory domain for SOAs corresponding to the beta-band (~ 13–25 Hz), i.e., for SOAs of ~ 40–80 ms, but lesser or even absent recalibration effects for SOAs ≥ 100 ms. The results of the present study are in line with the proposed framework as we found rapid visuo–tactile recalibration effects for SOAs < 100 ms, but not for SOAs ≥ 100 ms, while other studies found rapid audio–visual recalibration for SOAs ≥ 100 ms (van der Burg et al. 2013, 2015b).
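The band-to-SOA mapping in this argument is simply the cycle length of the oscillation; the following illustrative arithmetic reproduces the ranges quoted above.

```python
def cycle_length_ms(freq_hz):
    # period of one oscillatory cycle in milliseconds
    return 1000.0 / freq_hz

# beta band (~13–25 Hz) -> cycles of ~40–77 ms, matching the proposed SOA
# range for tactile recalibration; alpha (~8–12 Hz) -> ~83–125 ms cycles
print(round(cycle_length_ms(25)), round(cycle_length_ms(13)))  # 40 77
```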

In summary, we provide evidence for rapid recalibration in the visuo–tactile domain. The rapid recalibration, however, could only be demonstrated if the SOA in the preceding trial was smaller than 100 ms. Since rapid recalibration in the audio–visual domain has been demonstrated for SOAs ≥ 100 ms, we propose that visuo–tactile and audio–visual rapid recalibration work on different time scales. We suggest that the neural basis for these differences might be found in different frequency bands relevant for audio–visual and (visuo-) tactile temporal perception.

Notes

Compliance with ethical standards

Funding

JL was supported by the German Research Foundation (2400-4/1).

Supplementary material

221_2017_5132_MOESM1_ESM.tif (334 kb)
Supplementary Figure S1: Behavioral results without considering rapid recalibration. A) Proportion of simultaneity reports as a function of SOA, averaged across subjects. The black curve shows the best fitting skewed normal distribution fitted to the averaged data. The black vertical lines indicate the respective PSS (point of subjective simultaneity; i.e., the maximum of the fitted function). B) Same as A) but now showing the individual behavioral results for all 18 subjects included in A) and in the further analyses. C) Same as B) but now showing the four subjects excluded from further analyses either due to a strong bias towards reporting “non-simultaneous” (< 75% simultaneity reports for short SOAs, figures 1-3) or due to a strong bias towards reporting “simultaneous” (> 25% simultaneity reports for long SOAs, figure 4) (TIF 333 KB)

References

  1. Alais D, Carlile S (2005) Synchronizing to real events: subjective audiovisual alignment scales with perceived auditory depth and speed of sound. Proc Natl Acad Sci USA 102:2244–2247
  2. Alais D, Ho T, Han S, Van der Burg E (2017) A matched comparison across three different sensory pairs of cross-modal temporal recalibration from sustained and transient adaptation. Iperception 8:2041669517718697. https://doi.org/10.1177/2041669517718697
  3. Baumgarten TJ, Schnitzler A, Lange J (2015) Beta oscillations define discrete perceptual cycles in the somatosensory domain. Proc Natl Acad Sci USA 112:12187–12192. https://doi.org/10.1073/pnas.1501438112
  4. Baumgarten TJ, Königs S, Schnitzler A, Lange J (2017) Subliminal stimuli modulate somatosensory perception rhythmically and provide evidence for discrete perception. Sci Rep. https://doi.org/10.1038/srep43937
  5. Boenke LT, Deliano M, Ohl FW (2009) Stimulus duration influences perceived simultaneity in audiovisual temporal-order judgment. Exp Brain Res 198:233–244
  6. Cecere R, Rees G, Romei V (2015) Individual differences in alpha frequency drive crossmodal illusory perception. Curr Biol. https://doi.org/10.1016/j.cub.2014.11.034
  7. Cecere R, Gross J, Thut G (2016) Behavioural evidence for separate mechanisms of audiovisual temporal binding as a function of leading sensory modality. Eur J Neurosci 43(12):1561–1568
  8. Cecere R, Gross J, Willis A, Thut G (2017) Being first matters: topographical representational similarity analysis of ERP signals reveals separate networks for audiovisual temporal binding depending on the leading sense. J Neurosci 37(21):5274–5287
  9. Ernst MO, Banks MS (2002) Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415:429–433. https://doi.org/10.1038/415429a
  10. Ernst MO, Bülthoff HH (2004) Merging the senses into a robust percept. Trends Cogn Sci 8:162–169. https://doi.org/10.1016/j.tics.2004.02.002
  11. Fujisaki W, Nishida S (2009) Audio–tactile superiority over visuo–tactile and audio–visual combinations in the temporal resolution of synchrony perception. Exp Brain Res 198:245–259
  12. Fujisaki W, Shimojo S, Kashino M, Nishida S (2004) Recalibration of audiovisual simultaneity. Nat Neurosci 7:773
  13. Gick B, Ikegami Y, Derrick D (2010) The temporal window of audio-tactile integration in speech perception. J Acoust Soc Am 128:EL342–EL346
  14. Hanson JV, Heron J, Whitaker D (2008) Recalibration of perceived time across sensory modalities. Exp Brain Res 185:347–352
  15. Harrar V, Harris LR (2005) Simultaneity constancy: detecting events with touch and vision. Exp Brain Res 166:465–473
  16. Harrar V, Harris LR (2008) The effect of exposure to asynchronous audio, visual, and tactile stimulus combinations on the perception of simultaneity. Exp Brain Res 186:517–524
  17. Ho HT, Orchard-Mills E, Alais D (2015) Visuotactile temporal recalibration transfers across different locations. Multisens Res 28:351–370. https://doi.org/10.1163/22134808-00002498
  18. Kandel ER, Schwartz JH, Jessell TM et al (2000) Principles of neural science. McGraw-Hill, New York
  19. Kayser C, Logothetis NK (2007) Do early sensory cortices integrate cross-modal information? Brain Struct Funct 212:121–132
  20. Keetels M, Vroomen J (2007) No effect of auditory–visual spatial disparity on temporal recalibration. Exp Brain Res 182:559–565
  21. Kösem A, van Wassenhove V (2012) Temporal structure in audiovisual sensory selection. PLoS One 7:e40936
  22. Landau AN, Fries P (2012) Attention samples stimuli rhythmically. Curr Biol 22:1000–1004. https://doi.org/10.1016/j.cub.2012.03.054
  23. Luo H, Poeppel D (2007) Phase patterns of neuronal responses reliably discriminate speech in human auditory cortex. Neuron 54:1001–1010
  24. Noel J-P, Wallace MT, Orchard-Mills E et al (2015) True and perceived synchrony are preferentially associated with particular sensory pairings. Sci Rep 5:17467
  25. Pöppel E (1997) A hierarchical model of temporal perception. Trends Cogn Sci 1:56–61
  26. Romei V, Gross J, Thut G (2012) Sounds reset rhythms of visual cortex and corresponding human visual perception. Curr Biol 22:807–813. https://doi.org/10.1016/j.cub.2012.03.025
  27. Stein BE, Wallace MT (1996) Comparisons of cross-modality integration in midbrain and cortex. Prog Brain Res 112:289–299
  28. Stevenson RA, Wallace MT (2013) Multisensory temporal integration: task and stimulus dependencies. Exp Brain Res 227:249–261
  29. Thorne JD, Vos MD, Viola FC, Debener S (2011) Cross-modal phase reset predicts auditory task performance in humans. J Neurosci 31:3853–3861. https://doi.org/10.1523/JNEUROSCI.6176-10.2011
  30. van Wassenhove V, Grant KW, Poeppel D (2007) Temporal window of integration in auditory-visual speech perception. Neuropsychologia 45:598–607
  31. van der Burg E, Alais D, Cass J (2013) Rapid recalibration to audiovisual asynchrony. J Neurosci 33:14633–14637
  32. Van der Burg E, Alais D, Cass J (2015a) Audiovisual temporal recalibration occurs independently at two different time scales. Sci Rep 5:14526
  33. van der Burg E, Orchard-Mills E, Alais D (2015b) Rapid temporal recalibration is unique to audiovisual stimuli. Exp Brain Res 233:53–59
  34. VanRullen R (2016) Perceptual cycles. Trends Cogn Sci 20:723–735
  35. Vroomen J, Stekelenburg JJ (2010) Visual anticipatory information modulates multisensory interactions of artificial audiovisual stimuli. J Cogn Neurosci 22:1583–1596
  36. Yarrow K, Jahn N, Durant S, Arnold DH (2011) Shifts of criteria or neural timing? The assumptions underlying timing perception studies. Conscious Cogn 20:1518–1531

Copyright information

© The Author(s) 2017

Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  1. Medical Faculty, Institute of Clinical Neuroscience and Medical Psychology, Heinrich Heine University, Düsseldorf, Germany
