Abstract
It has been suggested that judgments about the temporal–spatial order of successive tactile stimuli depend on the perceived direction of apparent motion between them. Here we manipulated tactile apparent-motion percepts by presenting a brief, task-irrelevant auditory stimulus temporally in-between pairs of tactile stimuli. The tactile stimuli were applied one to each hand, with varying stimulus onset asynchronies (SOAs). Participants reported the location of the first stimulus (temporal order judgments: TOJs) while adopting both crossed and uncrossed hand postures, so we could scrutinize skin-based, anatomical, and external reference frames. With crossed hands, the sound improved TOJ performance at short (≤300 ms) and at long (>300 ms) SOAs. When the hands were uncrossed, the sound induced a decrease in TOJ performance, but only at short SOAs. A second experiment confirmed that the auditory stimulus indeed modulated tactile apparent motion perception under these conditions. Perceived apparent motion directions were more ambiguous with crossed than with uncrossed hands, probably indicating competing spatial codes in the crossed posture. However, irrespective of posture, the additional sound tended to impair potentially anatomically coded motion direction discrimination at a short SOA of 80 ms, but it significantly enhanced externally coded apparent motion perception at a long SOA of 500 ms. Anatomically coded motion signals imply incorrect TOJ responses with crossed hands, but correct responses when the hands are uncrossed; externally coded motion signals always point toward the correct TOJ response. Thus, taken together, these results suggest that apparent-motion signals are likely taken into account when tactile temporal–spatial information is reconstructed.
Time and space are not processed separately. For example, temporal synchrony has been found to influence whether or not two stimuli are perceived as spatially aligned (Lewald & Guski, 2003; Slutsky & Recanzone, 2001), and perceived simultaneity has been shown to depend on the spatial locations of the involved stimuli (Lewald & Guski, 2004; Stevenson, Fister, Barnett, Nidiffer, & Wallace, 2012). Moreover, the perception of time and space seems to be influenced by the motion system, which combines temporal and spatial information. For example, it has been shown that task-irrelevant tactile (Craig, 2003; Craig & Busey, 2003) or visual (Shibuya, Takahashi, & Kitazawa, 2007) motion cues interfere with performance in a temporal order judgment (TOJ) task, in which participants have to indicate the temporal–spatial order of two tactile stimuli applied with varying stimulus onset asynchronies (SOAs).
Tactile TOJ performance is also altered by the posture of the stimulated hands: When two tactile stimuli are applied, one to each hand, TOJ performance is impaired when the hands are crossed over the body midline, as compared to an uncrossed hand posture (Shore, Spry, & Spence, 2002; Yamamoto & Kitazawa, 2001). This crossing effect most likely emerges from a conflict between anatomical and external coordinates in the crossed posture, in which the left hand (anatomical coordinate) is located in the right hemifield (external coordinate) and the right hand is in the left hemifield. Thus, the external location of the stimuli influences task performance, even though the TOJ task could in principle be solved by relying on the anatomical coordinates alone (Azañón, Mihaljevic, & Longo, 2016; Badde, Heed, & Röder, 2016; Heed & Azañón, 2014; Heed, Buchholz, Engel, & Röder, 2015).
Kitazawa and colleagues (2007) have argued that the effects of apparent motion and posture on tactile TOJ performance might be related to each other: According to their hypothesis, the two successive tactile stimuli presented in the TOJ task are integrated into an apparent-motion signal, which is subsequently projected onto the skin. In the crossed-hands condition, the motion projection is assumed to be inverted because the direction of motion is presumably derived from the anatomical coordinates of the stimuli. In turn, the inverted-motion signal leads to a wrong reconstruction of the temporal order of the two stimuli. This motion projection hypothesis might explain a theoretically challenging finding from tactile TOJ tasks: At short SOAs below 300 ms—at which tactile apparent motion was found to be most prevalent (Kirman, 1974; Takahashi, Kansaku, Wada, Shibuya, & Kitazawa, 2013)—some participants more likely reported the reversed rather than the correct temporal order of the tactile stimuli, resulting in an N-shaped psychometric function (Wada, Yamamoto, & Kitazawa, 2004; Yamamoto & Kitazawa, 2001). The motion projection hypothesis (Kitazawa et al., 2007) has been supported by recent functional magnetic resonance imaging results showing an activation of motion-sensitive areas of the perisylvian cortex in a tactile TOJ task as compared to a control task (Takahashi et al., 2013).
The goal of the present study was to provide evidence that posture effects on tactile TOJ performance are related to tactile apparent-motion perception (Kitazawa et al., 2007; Takahashi et al., 2013). To this end, we manipulated the apparent-motion percept induced by the two successive tactile stimuli of the TOJ task, thereby directly testing the role of apparent motion between the to-be-judged stimuli. Previous studies have demonstrated that additional motion cues influence TOJ performance. However, these motion cues were independent of the apparent motion induced by the TOJ stimuli (Craig, 2003; Craig & Busey, 2003; Shibuya et al., 2007), and their effect could have been driven by decisional rather than perceptual processes (Sanabria, Spence, & Soto-Faraco, 2007). We addressed the reference frames involved by asking participants to adopt crossed and uncrossed hand postures during the task.
Tactile apparent motion percepts were manipulated by adding a brief, task-irrelevant auditory stimulus. Crucially, the static sound was presented both temporally and spatially in-between the two tactile stimuli and thus was uninformative for the TOJ task. Previous studies have demonstrated that presenting a brief static sound at an intervening time between two alternating visual or tactile stimuli modulates both the strength and the perceived direction of the resulting apparent-motion stream (Bruns & Getzmann, 2008; Chen, Shi, & Müller, 2011; Chen & Zhou, 2011; Freeman & Driver, 2008; Getzmann, 2007; Kafaligonul & Stoner, 2010; Shi, Chen, & Müller, 2010). At long SOAs, temporally intervening sounds facilitated the perception of apparent motion (Bruns & Getzmann, 2008; Getzmann, 2007), whereas at short SOAs the ability to discriminate the direction of apparent motion was impaired (Chen & Zhou, 2011). This effect of static sounds on apparent-motion percepts in another modality has been attributed to temporal ventriloquism. Temporal ventriloquism refers to the finding that the perceived timing of a visual stimulus is shifted toward a slightly asynchronous auditory stimulus (Bausenhart, de la Rosa, & Ulrich, 2014; Fendrich & Corballis, 2001; Morein-Zamir, Soto-Faraco, & Kingstone, 2003; Shimojo et al., 2001; Vroomen & de Gelder, 2004). Temporally intervening sounds, consequently, shorten the subjectively perceived SOA between two successive visual stimuli, resulting in impaired performance when judging the temporal order of the visual stimuli (Morein-Zamir et al., 2003; Shimojo et al., 2001).
With respect to apparent motion, shortening of the perceived SOA might have the effect of moving the tactile stimulus pair either outside (at short SOAs) or inside (at long SOAs) the range of SOAs in which apparent motion is typically perceived (Freeman & Driver, 2008; Getzmann, 2007). In other words, by means of temporal ventriloquism the sound should weaken tactile apparent-motion percepts at short SOAs, while potentially enhancing them at long SOAs. If crossing effects in tactile TOJ are related to inverted-motion signals at short SOAs (Kitazawa et al., 2007; Takahashi et al., 2013), weakening apparent motion should be associated with improved performance in the crossed posture but worse performance in the uncrossed posture.
Experiment 1
Method
Participants
Sixteen members of the community of Hamburg (14 right-handed, seven male; 21–44 years old, mean 27 years) took part in the first experiment. Two additional participants were replaced due to difficulties following the experimental instructions. All participants reported normal or corrected-to-normal vision, normal hearing abilities, and being free of tactile impairments. In return for their participation, they received course credit or were compensated with €7/h. Written informed consent was obtained from all participants prior to the start of the experiment, which was conducted in accordance with the ethical guidelines of the Declaration of Helsinki.
Apparatus and stimuli
Participants sat at a table, resting their hands and elbows on the table surface. Their index fingers lay on the response devices, which were placed at a distance of 25 cm from each other. Participants’ arms lay in either a crossed or an uncrossed posture. In the crossed condition, a foam cushion was placed underneath the upper arm to avoid skin contact between the hands and arms.
Tactile stimulators (Oticon bone conductors, type BC 461-012, Oticon Ltd, Milton Keynes, UK, about 1.6 × 1 × 0.8 cm in size) were taped to the middle fingers, covering the whole fingernail and some proximate skin. For stimulation, they were driven with a frequency of 167 Hz (i.e., a square wave with a cycle duration of 6 ms) for 10 ms. In each trial, a pair of tactile stimuli was presented, one stimulus to each hand. To shield participants from any auditory cues produced by the tactile stimulators, participants wore ear plugs as well as headphones that were constantly playing white noise. In some conditions, an additional sound stimulus was superimposed on the background noise, consisting of a discrete white-noise burst of 10 ms that was always presented exactly in-between the presentation of the first and second tactile stimuli. All participants reported clearly perceiving the sound stimuli, notwithstanding the auditory shielding. The experiment was controlled by Presentation, version 14.5 (Neurobehavioral Systems, Berkeley, CA, USA), which interfaced with custom-built hardware in order to drive stimulators and record responses.
Task
Participants were asked to indicate the temporal order of the two tactile stimuli by responding with the index finger of the hand that had received the first touch. Responses had to be withheld until the presentation of the second tactile stimulus had ended. Participants were informed that the auditory stimuli were task-irrelevant. No feedback was provided.
Design
Three factors were varied within participants: (a) the posture of the hands (factor: Crossing Status; levels: crossed and uncrossed), (b) the presentation of the additional sound (factor: Sound; levels: sound present and absent), and (c) the time interval between the two tactile stimuli (factor: SOA; levels: −1,000, −700, −500, −300, −200, −110, −80, −50, 50, 80, 110, 200, 300, 500, 700, and 1,000 ms, with negative values indicating “left hand first” stimulation and positive values indicating “right hand first” stimulation).
Procedure
Each trial lasted 2,500 ms longer than the SOA of the two tactile stimuli. Trials were repeated at the end of the block if participants had failed to respond within this time window. The experiment was divided into 16 blocks with 48 trials each; that is, each condition was repeated 12 times. The experiment took on average 100 min. SOA was varied within blocks, crossing status was changed every four blocks, and sound condition was changed after eight blocks. The order of conditions was counterbalanced across participants.
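The block structure described above can be checked with a quick count; all numbers are taken from the Design and Procedure text.

```python
# Consistency check of the Exp. 1 trial counts: 16 SOAs varied within
# blocks, 2 sound and 2 crossing conditions varied between blocks.
n_soa, n_sound, n_crossing = 16, 2, 2
blocks, trials_per_block = 16, 48

total_trials = blocks * trials_per_block      # 16 * 48 = 768 trials
n_conditions = n_soa * n_sound * n_crossing   # 64 condition cells
reps = total_trials // n_conditions           # 12 repetitions per cell
```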
Data analysis
Trials with reaction times (RTs) shorter than 200 ms or longer than two standard deviations above the individual mean (across conditions) were excluded from all analyses (2.5% of all trials). Responses were transformed into “right hand first” values, indicating whether or not participants perceived the first touch at the right hand (Sternberg & Knoll, 1973; Yamamoto & Kitazawa, 2001). These binary values were fitted using a generalized linear mixed model (GLMM), which allowed for performing a logistic regression while accounting for repeated measurements per participant (see also Heed, Backhaus, Röder, & Badde, 2016). The factors crossing status and sound were included in the model as summation contrast-coded predictors, and SOA was included as a covariate. To compensate for deviations of the resulting sigmoid (cf. Fig. 1) from the logistic distribution, SOA values were power-transformed using the parameter 2/3. Type III Wald chi-square tests were used to test for significant deviations of the estimated parameters from zero, and significant interactions were followed up by fitting submodels.
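The model described above can be sketched in simplified form. The snippet below shows the signed power transform of SOA and a plain logistic regression on simulated single-subject data; the paper's actual analysis was a GLMM across participants, and the generating slope of 0.08 is an arbitrary assumption for illustration.

```python
import numpy as np

def signed_power(soa_ms, p=2 / 3):
    """Signed power transform: compresses SOA magnitude while keeping
    the sign that codes 'left hand first' (negative) vs.
    'right hand first' (positive) stimulation."""
    soa = np.asarray(soa_ms, dtype=float)
    return np.sign(soa) * np.abs(soa) ** p

# Simulated binary "right hand first" responses (12 per SOA level)
rng = np.random.default_rng(0)
soas = np.repeat([-1000, -700, -500, -300, -200, -110, -80, -50,
                  50, 80, 110, 200, 300, 500, 700, 1000], 12)
x = signed_power(soas)
y = rng.binomial(1, 1 / (1 + np.exp(-0.08 * x)))

# Newton-Raphson fit of the logistic regression coefficients
X = np.column_stack([np.ones_like(x), x])
beta = np.zeros(2)
for _ in range(25):
    p_hat = 1 / (1 + np.exp(-X @ beta))
    w = p_hat * (1 - p_hat)
    beta += np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (y - p_hat))
slope = beta[1]  # positive: more "right first" reports at positive SOAs
```

The transform flattens the tails of the SOA range so that the fitted sigmoid better matches the empirical psychometric function.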
With crossed hands, “right hand first” responses at SOAs up to 300 ms often exhibit an N-shaped pattern—that is, an increase in incorrect responses with increasing SOA (Wada et al., 2004; Yamamoto & Kitazawa, 2001). In contrast, for SOAs longer than 300 ms, crossing status tends to affect the asymptotes of the psychometric function. To separately address these two changes in TOJ performance, we split the data into trials with short SOAs (shorter than or equal to 300 ms) and trials with long SOAs (longer than 300 ms). We conducted separate GLMMs predicting single-trial accuracy values for each data subset. Only the categorical predictors crossing status and sound were included in these models. Thus, the predictions were independent of the distribution of accuracies across SOAs.
To analyze RTs, we fitted a linear mixed model with the categorical predictors crossing status and sound to single-trial data. To capture the unimodal, discontinuous distribution of RTs across SOAs, we included two continuous predictors—linear and quadratic absolute SOA values.
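The shape of the RT model's SOA terms can be sketched as follows, with fixed effects only (the paper's linear mixed model additionally includes participant random effects and the categorical predictors); the generating coefficients are assumptions chosen so that RTs rise at short SOAs and flatten at long ones.

```python
import numpy as np

# Simulated RTs: longer at small |SOA|, with curvature captured by a
# quadratic term (coefficients are illustrative assumptions)
rng = np.random.default_rng(1)
abs_soa = np.repeat([50.0, 80, 110, 200, 300, 500, 700, 1000], 24)
rt = 900 - 0.8 * abs_soa + 4e-4 * abs_soa**2 + rng.normal(0, 20, abs_soa.size)

# Design matrix: intercept, linear, and quadratic absolute-SOA terms
X = np.column_stack([np.ones_like(abs_soa), abs_soa, abs_soa**2])
coef, *_ = np.linalg.lstsq(X, rt, rcond=None)
lin, quad = coef[1], coef[2]  # negative linear, positive quadratic term
```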
Results
In the crossed-hands conditions, TOJ response accuracy improved when an additional sound was presented (Fig. 1). In contrast, TOJ responses with uncrossed hands were less accurate when an additional sound was presented than in the sound-absent condition.
This interaction was confirmed by the statistical analyses: Parameter estimation revealed a significant three-way interaction between crossing status, sound, and SOA, χ²(1) = 23.08, p < .001; significant two-way interactions between all three variable pairings [Crossing Status × Sound: χ²(1) = 6.06, p = .014; Crossing Status × SOA: χ²(1) = 855.01, p < .001; Sound × SOA: χ²(1) = 20.74, p < .001]; and significant main effects of crossing status, χ²(1) = 8.35, p = .004, and SOA, χ²(1) = 1,284.28, p < .001. Follow-up analyses of the data from the crossed conditions revealed a significant two-way interaction between sound and SOA, χ²(1) = 29.60, p < .001, and a significant main effect of SOA, χ²(1) = 848.67, p < .001. Thus, the interaction between sound and SOA indicates a change in discrimination performance between the sound and no-sound conditions. Follow-up analyses of the data from the uncrossed conditions revealed a significant two-way interaction between sound and SOA, χ²(1) = 14.91, p < .001, and significant main effects of both factors: sound, χ²(1) = 7.09, p = .008, and SOA, χ²(1) = 5.76, p < .001. The main effect of sound was due to a slight shift toward “right hand first” responses in the sound as compared to the sound-absent conditions—that is, a shift in the point of subjective simultaneity of the tactile stimuli. The interaction of sound with the covariate SOA indicated a decline in discrimination performance with sound—that is, an increase in the just noticeable difference.
To further scrutinize these effects, we estimated separate submodels on the responses in trials with short (≤300 ms) and long (>300 ms) SOAs by fitting the accuracy values as a function of crossing status and sound condition. Both submodels revealed a two-way interaction between crossing status and sound condition, which was significant at short SOAs, χ²(1) = 15.17, p < .001, and marginal at long SOAs, χ²(1) = 3.31, p = .068; a main effect of crossing status—at short SOAs, χ²(1) = 998.97, p < .001; at long SOAs, χ²(1) = 96.90, p < .001; and, for long SOAs, a main effect of sound, χ²(1) = 27.14, p < .001. Follow-up comparisons revealed significant effects of sound for trials with short SOAs and uncrossed hands, χ²(1) = 11.43, p < .001; for trials with short SOAs and crossed hands, χ²(1) = 4.04, p = .045; and for trials with long SOAs and crossed hands, χ²(1) = 29.00, p < .001; but not for trials with long SOAs and uncrossed hands, χ²(1) = 1.55, p = .214.
An analysis of RTs (Fig. 2) revealed significant main effects of crossing status, χ²(1) = 2,885.42, p < .001, and sound, χ²(1) = 19.33, p < .001, as well as linear, χ²(1) = 740.00, p < .001, and quadratic, χ²(1) = 326.80, p < .001, effects of absolute SOA. Furthermore, crossing status interacted significantly with both the linear, χ²(1) = 161.73, p < .001, and quadratic, χ²(1) = 129.43, p < .001, SOA terms. RTs were longer when the hands were crossed than when they were uncrossed, and when a sound was presented as compared to when it was not. Furthermore, RTs became longer with decreasing SOA of the two stimuli, and this increase of RTs was steeper for uncrossed than for crossed trials.
Discussion
Experiment 1 tested the influence of a task-irrelevant sound on tactile TOJ performance. Introducing an additional sound between the two tactile stimuli had opposing effects on TOJ performance with crossed and uncrossed hands. At short SOAs, performance with crossed hands improved with the additional sound, whereas performance with uncrossed hands decreased. This pattern of results at short SOAs is consistent with our hypotheses. For short SOAs of around 70 ms, it has been shown that temporally intervening sounds impair the ability to discriminate the direction of tactile apparent motion, as compared to a condition without sounds (Chen & Zhou, 2011). Assuming that the temporal–spatial order of two tactile stimuli is calculated on the basis of a motion signal coded in anatomical coordinates (Kitazawa et al., 2007; Takahashi et al., 2013), weakening the apparent-motion percept should have opposite effects on TOJ performance with crossed and uncrossed hands. When the hands are crossed, an anatomically coded apparent-motion signal implies the wrong response. Thus, in line with our results, TOJ performance with crossed hands should benefit from an impaired ability to discriminate the direction of tactile apparent motion. By contrast, when the hands are uncrossed, anatomical and external coordinates of touch coincide. Consequently, the apparent-motion signal points toward the correct response, and decreasing the motion signal should lead to impaired performance.
However, crossed-hands performance not only improved for short SOAs, but for long SOAs as well. For long SOAs, a perceptual shortening of the time interval between the tactile stimuli should have fostered, rather than interfered with, the apparent-motion percept by moving the perceived SOA into the preferred range of apparent motion (Bruns & Getzmann, 2008; Getzmann, 2007). A strengthened apparent-motion percept could have improved participants’ performance with crossed hands only if it contained the correct temporal–spatial information needed to perform the task—that is, only if it was not inverted due to anatomical spatial coding.
Thus, if the observed changes in TOJ performance are related to changes in tactile apparent motion, motion signals at short and long SOAs must be coded in different reference frames. We conducted a second experiment to directly test the effect of the sound on tactile apparent-motion perception as a function of SOA and hand posture.
Experiment 2
In a second experiment, we directly assessed the influence of the intervening static sound on the perceived direction of tactile apparent motion at both short and long SOAs, as well as with crossed and uncrossed hands. We expected the intervening sound to reduce apparent-motion percepts for short SOAs (Chen & Zhou, 2011) while amplifying apparent-motion percepts for longer SOAs (Bruns & Getzmann, 2008; Getzmann, 2007). The motion projection hypothesis, and thus the interpretation of Experiment 1, assume that for short SOAs tactile apparent motion is coded in anatomical coordinates. However, studies probing the time course of tactile remapping have indicated that external coding of touch starts to dominate relative to anatomical coding 60 to 190 ms after stimulus onset (Azañón & Soto-Faraco, 2008; Brandes & Heed, 2015; Ley, Steinberg, Hanganu-Opatz, & Röder, 2015; Overvliet, Azañón, & Soto-Faraco, 2011). Thus, at longer SOAs, tactile apparent-motion signals might be coded with respect to external space.
Method
Participants
Twenty members of the community of Hamburg were recruited for the second experiment. Three additional participants were replaced due to technical problems during data acquisition. All participants reported normal or corrected-to-normal vision, normal hearing abilities, and being free of tactile impairments. In return for their participation, they received course credit or were compensated with €7/h. Written informed consent was obtained from all participants prior to the start of the experiment, which was conducted in accordance with the ethical guidelines of the Declaration of Helsinki.
Apparatus and stimuli
The stimulations consisted of multiple rather than two (as in Exp. 1) tactile stimuli, alternately presented on the two hands with an SOA of either 80 or 500 ms. Each stimulus sequence lasted 5,000 ms, and thus comprised either 62 (at the 80-ms SOA) or 10 (at the 500-ms SOA) tactile stimuli in total. In the sound condition, auditory stimuli (identical to those used in Exp. 1) were interleaved between every other pair of tactile stimuli; that is, the first auditory stimulus was presented in the time interval between the first and second tactile stimuli, the second auditory stimulus in the time interval between the third and fourth tactile stimuli, and so on (see Fig. 3). In all other aspects, the apparatus and stimuli were identical to the setup of Experiment 1.
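The sequence structure just described can be sketched as a trial timeline. The onset conventions below (first stimulus at t = 0; a left-hand start) are assumptions for illustration; the paper specifies only the SOA, the sequence duration, and that each sound falls midway between every other tactile pair, with the starting hand varied by the Start Motion Direction factor.

```python
# Sketch of one Exp. 2 trial timeline (onset times in ms, assumed)
def make_timeline(soa_ms, n_tactile, sound=True):
    # Tactile stimuli alternate between the hands at a fixed SOA
    tactile = [(i * soa_ms, 'left' if i % 2 == 0 else 'right')
               for i in range(n_tactile)]
    sounds = []
    if sound:
        # Sounds fall between stimulus pairs 1-2, 3-4, 5-6, ...
        for i in range(0, n_tactile - 1, 2):
            sounds.append(i * soa_ms + soa_ms / 2)
    return tactile, sounds

# Long-SOA sequence: 10 tactile stimuli at 500 ms, five interleaved sounds
tactile, sounds = make_timeline(500, 10)
```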
Note that presenting multiple rather than only two tactile stimuli made the stimulation sequences ambiguous with respect to the perceived direction of apparent motion, and thus allowed us to assess the effect of additional auditory stimuli on motion direction discrimination (for similar procedures, see Chen et al., 2011; Freeman & Driver, 2008). The stimulation sequences typically resulted in the perception of multiple short, unidirectional sweeps moving either from the left to the right hand or vice versa. Thus, perceptually the stimuli were grouped in either left–right or right–left pairs. Previous research has shown that participants often show a bias to perceive the motion direction that is congruent with the direction indicated by the first two stimuli in the sequence, rather than the opposite direction (Chen et al., 2011). This start motion direction effect (see also the Data Analysis section below) was expected to be enhanced by the additional auditory stimuli (Freeman & Driver, 2008).
Task
Participants were asked to indicate the direction of the perceived tactile motion sweeps by responding with the index finger of the hand at which the motion was directed. If participants perceived no motion or had changing motion percepts during one trial, they were asked to report which motion direction they had most likely or predominantly perceived. Responses had to be withheld until all tactile stimuli had been presented. As in Experiment 1, participants were told that the auditory stimuli were task-irrelevant. No feedback was provided.
Design
Four factors were varied within participants: (a) the posture of the hands (factor: Crossing Status; levels: crossed and uncrossed), (b) the presentation of the additional sound (factor: Sound; levels: sound present and absent), (c) the time interval between the tactile stimuli (factor: SOA; levels: 80 and 500 ms), and (d) the initial direction of the first two tactile stimuli presented at the beginning of a trial (factor: Start Motion Direction; levels: “from left to right hand,” “from right to left hand”).
Procedure
Each trial lasted 5,000 ms. Participants responded after the last stimulus had been presented, and their response time was unrestricted. The intertrial interval started after the response had been given and lasted 500 ms. The experiment was divided into 20 blocks of 32 trials each, resulting in 40 repetitions per condition. Participants completed the experiment in two sessions of 80 min each. SOA, start motion direction, and sound were varied within blocks, and crossing status changed every two blocks. Condition order was counterbalanced across participants.
Data analysis
Responses were recoded with respect to the start motion direction effect, indicating whether or not participants perceived the apparent-motion sweeps in the same direction as the first two stimuli in a trial. Even though the tactile stimulation was ambiguous with respect to the perceived direction of apparent motion, it is well-known that participants often show a bias for reporting motion in the initial direction from the first to the second stimulus in the sequence (Chen et al., 2011). This start motion direction effect relies on the perceived motion of the first two stimuli and, thus, reflects the same motion percept that was presumably elicited in Experiment 1. The resulting binary values were fitted using a GLMM, which allowed for performing a logistic regression while accounting for repeated measurements per participant. The factors crossing status, sound, and SOA were included in the model as summation contrast-coded categorical predictors. Type III Wald chi-square tests were used to test for significant deviations of the estimated parameters from zero, and significant interactions were followed up by fitting submodels. Trials with RTs shorter than 200 ms or longer than two standard deviations above the individual mean (across conditions) were excluded from the analysis (7.0% of all trials).
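The recoding step can be sketched as follows; the direction labels 'LR'/'RL' are hypothetical placeholders, not the labels used in the actual dataset.

```python
# A trial counts toward the start-motion-direction effect when the
# reported sweep direction matches the direction implied by the first
# two stimuli of that trial.
def recode_start_direction(responses, start_directions):
    return [int(resp == start)
            for resp, start in zip(responses, start_directions)]

coded = recode_start_direction(['LR', 'RL', 'LR', 'LR'],
                               ['LR', 'LR', 'LR', 'RL'])
```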
Results
For both SOAs, motion in the start direction was more likely to be perceived with uncrossed than with crossed hands. In the short-SOA condition, participants’ apparent-motion percepts were numerically closer to chance level in the sound than in the no-sound conditions (see Fig. 4). In the long-SOA condition, the perceived direction of tactile apparent motion was more likely to agree with the start motion direction in the sound than in the no-sound conditions.
Parameter estimation revealed a marginally significant three-way interaction between crossing status, sound, and SOA, χ²(1) = 3.27, p = .071; significant two-way interactions between crossing status and SOA, χ²(1) = 27.52, p < .001, and between sound and SOA, χ²(1) = 5.13, p = .023; as well as significant main effects of crossing status, χ²(1) = 105.81, p < .001, sound, χ²(1) = 4.21, p = .040, and SOA, χ²(1) = 89.24, p < .001. Follow-up analysis of the data from the short-SOA conditions revealed only a significant main effect of crossing status, χ²(1) = 13.52, p < .001. Follow-up analysis of the data from the long-SOA conditions revealed significant main effects for both factors: crossing status, χ²(1) = 115.92, p < .001, and sound, χ²(1) = 8.76, p = .003.
Discussion
The results of Experiment 2 confirmed that task-irrelevant static sounds alter the perceived direction of ambiguous tactile apparent-motion streams, dependent on SOA and hand posture. For a short SOA of 80 ms, only posture significantly influenced whether the tactile apparent-motion streams were perceived in the same direction as the starting direction. Systematically reversed TOJ responses for short SOAs with crossed hands have been observed in fewer than 50% of participants (Kóbor, Füredi, Kovács, Spence, & Vidnyánszky, 2006; Wada et al., 2004; Yamamoto & Kitazawa, 2001). Thus, similar interindividual variability might have obscured the effect of sounds on motion discrimination performance in the present experiment.
At a long SOA of 500 ms, participants were more likely to perceive movement in the starting direction from the first to the second tactile stimulus with the additional sounds, as compared to a baseline condition without the sounds. This result supports our assumption that a sound can, most likely by means of temporal ventriloquism, induce or amplify apparent-motion percepts at long SOAs. The effect of the sound did not differ between the two hand posture conditions. Although the start motion effect was stronger with uncrossed than with crossed hands, we found no evidence of inverted-motion signals with crossed hands at long SOAs. This suggests that apparent motion at long SOAs was based on external rather than on anatomical coding of tactile stimuli.
General discussion
The present study examined the relationship between posture effects on judgments of the temporal–spatial order of two tactile events and the apparent-motion signals induced by these stimuli. It has been suggested that judgments about the temporal–spatial order of two tactile stimuli are based on the perceived direction of apparent motion between them; thus, tactile temporal–spatial perception has been proposed to critically depend on the integration of the single stimuli into a motion signal (Kitazawa et al., 2007; Takahashi et al., 2013). To test this hypothesis, we manipulated the apparent-motion percept induced by two successive tactile stimuli by adding a brief, task-irrelevant auditory stimulus temporally in-between the two stimuli. We observed sound-associated gains in TOJ performance with crossed hands at both short and long SOAs; by contrast, TOJ performance with uncrossed hands was impaired at short SOAs, but unaffected (and nearly perfect) at long SOAs (Exp. 1). In accordance with a motion-based interpretation of these effects, and in line with previous findings (Bruns & Getzmann, 2008; Chen et al., 2011; Chen & Zhou, 2011; Freeman & Driver, 2008; Getzmann, 2007), the addition of the sound increased apparent-motion percepts in the same direction as the initial two stimuli at a long SOA of 500 ms (Exp. 2). In both sound conditions, motion direction discrimination was affected by the posture of the hands.
In both experiments, the intervening sound most likely induced a temporal ventriloquism effect—that is, a perceived shortening of the SOA between the leading and trailing tactile stimuli (Fendrich & Corballis, 2001; Morein-Zamir, Soto-Faraco, & Kingstone, 2003; Shimojo et al., 2001; Vroomen & de Gelder, 2004). In line with this assumption, in both experiments the effect of the sound on the perceived temporal order of the tactile stimuli and on the apparent motion they induced varied with the time interval between them.
For short SOAs, the sound-induced shortening of the perceived time interval between the tactile stimuli likely moved the stimuli outside of the range of SOAs optimal for apparent-motion perception. Thus, the sound should have reduced the strength of the apparent-motion percept between the two tactile stimuli. Attenuating the apparent-motion percept might have facilitated TOJ performance at short SOAs in the crossed-hands condition of Experiment 1 by suppressing incorrect responses based on an inverted apparent-motion percept. In contrast, for uncrossed hands, apparent motion is typically perceived as moving in the “correct” direction. Thus, apparent-motion signals contain additional evidence for the correct response, and attenuating apparent-motion signals would result in lower TOJ performance. In sum, the sound-induced reduction of crossing effects at short SOAs agrees with the motion projection hypothesis put forward by Kitazawa and colleagues (2007). Consistently, in Experiment 2 adding an intervening sound tended to make the perceived motion direction more ambiguous at short SOAs. However, this observation was not statistically confirmed (for corroborating findings, see also Chen & Zhou, 2011).
Even though our results agree with a link between (inverted) apparent-motion percepts and spatial TOJ performance, we cannot rule out that the sound affected TOJ performance and apparent-motion perception independently of each other. A sound-induced shortening of the perceived SOA between two stimuli makes it more difficult to temporally discriminate the first from the second stimulus. Therefore, temporal ventriloquism brings TOJ performance closer to chance level (Morein-Zamir et al., 2003; Shimojo et al., 2001). At short SOAs, tactile TOJ responses with crossed hands formed an N-shaped pattern when the sound was absent (Fig. 1; see also Wada et al., 2004; Yamamoto & Kitazawa, 2001); that is, performance was below chance level for some short SOAs. For these SOAs, a shift toward chance level would manifest as improved TOJ performance. In turn, TOJ performance with uncrossed hands was above chance and should decrease when the temporal order of the stimuli becomes more ambiguous. Thus, it is possible that at short SOAs the observed changes in tactile TOJ performance were driven not by changes in apparent-motion perception, but solely by changes in the perceived time interval between the stimuli.
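The arithmetic behind this alternative account can be made explicit. The following sketch is purely illustrative and is not the authors' analysis; the accuracy values and the shrinkage factor are hypothetical. It shows why pulling accuracy toward chance (50%) improves below-chance (crossed-hands) performance while impairing above-chance (uncrossed-hands) performance:

```python
# Illustrative only: if temporal ventriloquism merely pulls TOJ accuracy
# toward chance (0.5), below-chance performance improves while
# above-chance performance declines.

def pull_toward_chance(accuracy, shrink=0.5, chance=0.5):
    """Shrink the distance between accuracy and chance by a given factor."""
    return chance + (accuracy - chance) * (1 - shrink)

crossed_no_sound = 0.30    # hypothetical below-chance accuracy (inverted responses)
uncrossed_no_sound = 0.85  # hypothetical above-chance accuracy

print(pull_toward_chance(crossed_no_sound))    # 0.40: closer to chance, i.e., improved
print(pull_toward_chance(uncrossed_no_sound))  # 0.675: closer to chance, i.e., impaired
```

The same qualitative pattern holds for any shrinkage factor between 0 and 1, which is why this account can mimic the crossed-hands benefit at short SOAs without invoking motion signals at all.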
However, sounds facilitated TOJ performance in the crossed-hands condition not only at short SOAs, but also at long SOAs. In contrast to short SOAs, TOJ performance at long SOAs was above chance level irrespective of whether the sound was present or absent. Thus, a shortening of the perceived time interval between the stimuli, which should bring performance closer to chance level, cannot explain the observed improvement in TOJ performance at long SOAs. Moreover, in contrast to TOJs with crossed hands, which are based on spatial features (i.e., which hand was stimulated first, as tested in the present study), nonspatial tactile TOJ performance (e.g., requiring the discrimination of tactile patterns or intensities) is usually perfect for SOAs longer than 150 ms (Craig & Xu, 1990). Thus, it seems very unlikely that sound-induced changes affecting only the temporal perception of the tactile stimuli caused the improvements in TOJ performance with crossed hands at long SOAs. A sound-induced increase in apparent-motion percepts in the “correct” direction, by contrast, would predict the observed improvement in TOJ performance. Consistently, Experiment 2 indicated that, for both hand postures, the intervening sound increased apparent-motion percepts in the direction implied by the two tactile stimuli at long SOAs.
Takahashi et al. (2013) measured the strength of apparent-motion percepts induced by two tactile stimuli, one applied to each hand, across a wide range of SOAs. With an SOA of 500 ms between the two tactile stimuli, and independent of hand posture, participants still perceived apparent motion in about 20% of trials, and the probability of apparent-motion percepts rose steeply as the SOA decreased. Thus, even a small sound-induced shortening of the perceived SOA (i.e., a temporal ventriloquism effect) could have induced a considerable change in the strength of apparent motion between the tactile stimuli. With uncrossed hands, TOJ performance at long SOAs might have been unaffected by the sounds because it was already at ceiling without them, so that the effect of the sound at long SOAs became apparent only in the crossed-hands condition.
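The leverage that a small perceived-SOA shift gains from a steep psychometric function can be illustrated with a toy model. The logistic form, its midpoint, and its slope below are hypothetical choices made for illustration; they are not fitted to Takahashi et al.'s (2013) data, beyond matching the reported ~20% motion rate at 500 ms:

```python
import math

# Toy model (parameters illustrative, not fitted to Takahashi et al., 2013):
# the probability of an apparent-motion percept falls logistically with SOA.
def p_motion(soa_ms, midpoint=300.0, slope=0.01):
    return 1.0 / (1.0 + math.exp(slope * (soa_ms - midpoint)))

# On the rising part of the curve, a modest sound-induced shortening of the
# perceived SOA (temporal ventriloquism) changes motion probability markedly.
print(p_motion(500.0))         # ~0.12 at the physical 500-ms SOA
print(p_motion(500.0 - 80.0))  # ~0.23 after a hypothetical 80-ms perceived shortening
```

Under such a function, the same perceived shortening produces a much larger absolute change in motion probability at intermediate SOAs than at the extremes, consistent with the idea that even a weak temporal ventriloquism effect could noticeably strengthen apparent motion at a 500-ms SOA.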
In sum, our results are in agreement with an interaction between posture effects on apparent-motion percepts and on the perception of the temporal–spatial order of two successive tactile stimuli. At short SOAs, however, it is possible that both motion and temporal-order perception changed with the sound, but independently of each other. At long SOAs, motion-independent explanations for the observed change in TOJ performance seem less likely. Consequently, a strong interpretation of the motion projection hypothesis (Kitazawa et al., 2007), stating that TOJs are entirely based on apparent-motion signals, seems unlikely. In fact, the motion projection hypothesis cannot explain crossing effects in tasks involving only one tactile stimulus (Badde et al., 2016). Rather, we suggest that tactile apparent-motion signals provided an additional cue for TOJ responses. Results consistent with this interpretation have recently been obtained in the visual domain: TOJs about two visual stimuli were more accurate when these stimuli induced an apparent-motion percept than when no motion percept emerged (Baruch, Yeshurun, & Shore, 2013).
In both experiments, we scrutinized the involved reference frames by manipulating hand posture. Tactile stimuli are initially represented in skin-based, anatomical coordinates and are then transformed into an external reference frame through the integration of posture information (for recent reviews, see Badde & Heed, 2016; Heed et al., 2015). After remapping, both spatial codes are retained (Buchholz, Jensen, & Medendorp, 2011, 2013; Heed & Röder, 2010; Ley et al., 2015) and probably integrated to estimate tactile location (Badde, Heed, & Röder, 2014, 2016; Badde, Röder, & Heed, 2014, 2015; Cadieux, Barnett-Cowan, & Shore, 2010; Shore et al., 2002). When the hands are crossed, anatomical and external codes are in conflict, and weighted integration of these conflicting codes can lead to erroneous location estimates. Apparent-motion signals using such ambiguous location estimates as start and end locations might be more ambiguous than those based on conflict-free location estimates (cf. Kirman, 1974). Consistently, the perceived direction of apparent motion was more ambiguous for crossed than for uncrossed postures in Experiment 2. Similar effects of posture on apparent-motion perception have been demonstrated in the so-called cross-modal dynamic capture effect, in which the perceived direction of tactile apparent motion is biased toward the direction of a concurrent visual (Craig, 2006; Lyons, Sanabria, Vatakis, & Spence, 2006) or auditory (Soto-Faraco, Spence, & Kingstone, 2004) apparent-motion stream. Cross-modal capture effects were found to be influenced by the hands’ crossing status (Chen, Wang, & Bao, 2014; Jiang & Chen, 2013; Sanabria, Soto-Faraco, & Spence, 2005), suggesting that the discrimination of tactile motion direction was impaired when a nondefault posture was adopted (Zampini, Harris, & Spence, 2005).
Importantly, posture affected only the perceived direction of apparent motion, whereas the strength of the motion percept was rated similarly in crossed and uncrossed postures (Jiang & Chen, 2013; Takahashi et al., 2013).
The relative dominance of anatomical and external tactile codes varies over time. Shortly after their application, tactile stimuli are coded in an anatomical reference frame, whereas later on external tactile codes seem to be weighted more strongly (Azañón & Soto-Faraco, 2008; Brandes & Heed, 2015; Ley et al., 2015). Similarly, the reference frame of tactile apparent motion might change with the time interval between the stimuli. Kitazawa et al. (2007) based the motion projection hypothesis on the assumption that tactile apparent motion is coded anatomically at short SOAs. However, experiments on the effect of local motion on global tactile motion perception (Craig, 2003; Craig & Busey, 2003) and on tactile motion aftereffects (Kuroki, Watanabe, Mabuchi, Tachi, & Nishida, 2012) are rather in accordance with coding of tactile apparent motion in an external reference frame. In Experiment 2, we tested the reference frame of tactile apparent motion by means of hand crossing at a short and a long SOA. At the short SOA, the direction of apparent motion was numerically inverted when the hands were crossed. This inversion points toward anatomical coding of apparent motion, although the effect did not reach significance. At the long SOA, with crossed hands the perceived direction of apparent motion instead agreed with the external direction of the stimulation. Thus, if the time interval between the stimuli is longer, tactile apparent motion might be coded in an external rather than an anatomical reference frame, although corroborating results would be needed to draw a final conclusion.
Taken together, the present results suggest a link between motion perception and the perception of temporal–spatial order. The perception of apparent motion, or of high-level motion more generally, must be inferred from the spatiotemporal characteristics of the involved stimuli, such as their temporal separation and spatial distance (Kirman, 1974; Strybel, Manligas, Chan, & Perrott, 1990). The reference frame dominating the resulting percept, in turn, seems to depend crucially on the SOA between the stimuli. Conversely, the present and other findings (Baruch et al., 2013; Takahashi et al., 2013) suggest that explicit temporal order judgments take into account information from the motion system. Thus, both the perception of apparent motion and the discrimination of temporal order might be mediated by a common underlying mechanism.
References
Azañón, E., Mihaljevic, K., & Longo, M. R. (2016). A three-dimensional spatial characterization of the crossed-hands deficit. Cognition, 157, 289–295.
Azañón, E., & Soto-Faraco, S. (2008). Changing reference frames during the encoding of tactile events. Current Biology, 18, 1044–1049.
Badde, S., & Heed, T. (2016). Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cognitive Neuropsychology, 33, 26–47.
Badde, S., Heed, T., & Röder, B. (2014). Processing load impairs coordinate integration for the localization of touch. Attention, Perception, & Psychophysics, 76, 1136–1150. https://doi.org/10.3758/s13414-013-0590-2
Badde, S., Heed, T., & Röder, B. (2016). Integration of anatomical and external response mappings explains crossing effects in tactile localization: A probabilistic modeling approach. Psychonomic Bulletin & Review, 23, 387–404.
Badde, S., Röder, B., & Heed, T. (2014). Multiple spatial representations determine touch localization on the fingers. Journal of Experimental Psychology: Human Perception and Performance, 40, 784–801.
Badde, S., Röder, B., & Heed, T. (2015). Flexibly weighted integration of tactile reference frames. Neuropsychologia, 70, 367–374.
Baruch, O., Yeshurun, Y., & Shore, D. I. (2013). Space and time: An impact of spatial separation, apparent motion, and perceptual grouping on TOJ performance. Perception, 42, 551–561.
Bausenhart, K. M., de la Rosa, M. D., & Ulrich, R. (2014). Multimodal integration of time. Experimental Psychology, 61, 310.
Brandes, J., & Heed, T. (2015). Reach trajectories characterize tactile localization for sensorimotor decision making. Journal of Neuroscience, 35, 13648–13658. https://doi.org/10.1523/JNEUROSCI.1873-14.2015
Bruns, P., & Getzmann, S. (2008). Audiovisual influences on the perception of visual apparent motion: Exploring the effect of a single sound. Acta Psychologica, 129, 273–283.
Buchholz, V. N., Jensen, O., & Medendorp, W. P. (2011). Multiple reference frames in cortical oscillatory activity during tactile remapping for saccades. Journal of Neuroscience, 31, 16864–16871.
Buchholz, V. N., Jensen, O., & Medendorp, W. P. (2013). Parietal oscillations code nonvisual reach targets relative to gaze and body. Journal of Neuroscience, 33, 3492–3499.
Cadieux, M. L., Barnett-Cowan, M., & Shore, D. I. (2010). Crossing the hands is more confusing for females than males. Experimental Brain Research, 204, 431–446.
Chen, L., Shi, Z., & Müller, H. J. (2011). Interaction of perceptual grouping and crossmodal temporal capture in tactile apparent-motion. PLoS ONE, 6, e17130. https://doi.org/10.1371/journal.pone.0017130
Chen, L., Wang, Q., & Bao, M. (2014). Spatial references and audio-tactile interaction in cross-modal dynamic capture. Multisensory Research, 27, 55–70.
Chen, L., & Zhou, X. (2011). Capture of intermodal visual/tactile apparent motion by moving and static sounds. Seeing and Perceiving, 24, 369–389.
Craig, J. C. (2003). The effect of hand position and pattern motion on temporal order judgments. Perception & Psychophysics, 65, 779–788.
Craig, J. C. (2006). Visual motion interferes with tactile motion perception. Perception, 35, 351–367.
Craig, J. C., & Xu, B. (1990). Temporal order and tactile patterns. Perception & Psychophysics, 47, 22–34.
Craig, J. C., & Busey, T. A. (2003). The effect of motion on tactile and visual temporal order judgments. Perception & Psychophysics, 65, 81–94.
Fendrich, R., & Corballis, P. M. (2001). The temporal cross-capture of audition and vision. Perception & Psychophysics, 63, 719–725.
Freeman, E., & Driver, J. (2008). Direction of visual apparent motion driven solely by timing of a static sound. Current Biology, 18, 1262–1266. https://doi.org/10.1016/j.cub.2008.07.066
Getzmann, S. (2007). The effect of brief auditory stimuli on visual apparent motion. Perception, 36, 1089–1103.
Heed, T., & Azañón, E. (2014). Using time to investigate space: A review of tactile temporal order judgments as a window onto spatial processing in touch. Frontiers in Psychology, 5, 76. https://doi.org/10.3389/fpsyg.2014.00076
Heed, T., Backhaus, J., Röder, B., & Badde, S. (2016). Disentangling the external reference frames relevant to tactile localization. PLoS ONE, 11, e0158829. https://doi.org/10.1371/journal.pone.0158829
Heed, T., Buchholz, V. N., Engel, A. K., & Röder, B. (2015). Tactile remapping: From coordinate transformations to integration in sensorimotor processing. Trends in Cognitive Sciences, 19, 251–258.
Heed, T., & Röder, B. (2010). Common anatomical and external coding for hands and feet in tactile attention: Evidence from event-related potentials. Journal of Cognitive Neuroscience, 22, 184–202.
Jiang, Y., & Chen, L. (2013). Mutual influences of intermodal visual/tactile apparent motion and auditory motion with uncrossed and crossed arms. Multisensory Research, 26, 19–51.
Kafaligonul, H., & Stoner, G. R. (2010). Auditory modulation of visual apparent motion with short spatial and temporal intervals. Journal of Vision, 10(12), 31. https://doi.org/10.1167/10.12.31
Kirman, J. H. (1974). Tactile apparent movement: The effects of interstimulus onset interval and stimulus duration. Perception & Psychophysics, 15, 1–6.
Kitazawa, S., Moizumi, S., Okuzumi, A., Saito, F., Shibuya, S., Takahashi, T., Wada, M., & Yamamoto, S. (2007). Reversal of subjective temporal order due to sensory and motor integrations. In P. Haggard, Y. Rossetti, & M. Kawato (Eds.), Sensorimotor foundations of higher cognition: Attention and performance XXII (pp. 73–97). New York, NY: Oxford University Press.
Kóbor, I., Füredi, L., Kovács, G., Spence, C., & Vidnyánszky, Z. (2006). Back-to-front: Improved tactile discrimination performance in the space you cannot see. Neuroscience Letters, 400, 163–167.
Kuroki, S., Watanabe, J., Mabuchi, K., Tachi, S., & Nishida, S. (2012). Directional remapping in tactile inter-finger apparent motion: A motion aftereffect study. Experimental Brain Research, 216, 311–320. https://doi.org/10.1007/s00221-011-2936-0
Lewald, J., & Guski, R. (2003). Cross-modal perceptual integration of spatially and temporally disparate auditory and visual stimuli. Cognitive Brain Research, 16, 468–478.
Lewald, J., & Guski, R. (2004). Auditory–visual temporal integration as a function of distance: No compensation for sound-transmission time in human perception. Neuroscience Letters, 357, 119–122.
Ley, P., Steinberg, U., Hanganu-Opatz, I. L., & Röder, B. (2015). Event-related potential evidence for a dynamic (re-)weighting of somatotopic and external coordinates of touch during visual-tactile interactions. European Journal of Neuroscience, 41, 1466–1474.
Lyons, G., Sanabria, D., Vatakis, A., & Spence, C. (2006). The modulation of crossmodal integration by unimodal perceptual grouping: A visuotactile apparent motion study. Experimental Brain Research, 174, 510–516.
Morein-Zamir, S., Soto-Faraco, S., & Kingstone, A. (2003). Auditory capture of vision: Examining temporal ventriloquism. Cognitive Brain Research, 17, 154–163.
Overvliet, K. E., Azañón, E., & Soto-Faraco, S. (2011). Somatosensory saccades reveal the timing of tactile spatial remapping. Neuropsychologia, 49, 3046–3052.
Sanabria, D., Soto-Faraco, S., & Spence, C. (2005). Spatiotemporal interactions between audition and touch depend on hand posture. Experimental Brain Research, 165, 505–514.
Sanabria, D., Spence, C., & Soto-Faraco, S. (2007). Perceptual and decisional contributions to audiovisual interactions in the perception of apparent motion: A signal detection study. Cognition, 102, 299–310. https://doi.org/10.1016/j.cognition.2006.01.003
Shi, Z., Chen, L., & Müller, H. J. (2010). Auditory temporal modulation of the visual Ternus effect: The influence of time interval. Experimental Brain Research, 203, 723–735.
Shibuya, S., Takahashi, T., & Kitazawa, S. (2007). Effects of visual stimuli on temporal order judgments of unimanual finger stimuli. Experimental Brain Research, 179, 709–721.
Shimojo, S., Scheier, C., Nijhawan, R., Shams, L., Kamitani, Y., & Watanabe, K. (2001). Beyond perceptual modality: Auditory effects on visual perception. Acoustical Science and Technology, 22, 2.
Shore, D. I., Spry, E., & Spence, C. (2002). Confusing the mind by crossing the hands. Cognitive Brain Research, 14, 153–163.
Slutsky, D. A., & Recanzone, G. H. (2001). Temporal and spatial dependency of the ventriloquism effect. NeuroReport, 12, 7–10.
Soto-Faraco, S., Spence, C., & Kingstone, A. (2004). Congruency effects between auditory and tactile motion: Extending the phenomenon of cross-modal dynamic capture. Cognitive, Affective, & Behavioral Neuroscience, 4, 208–217. https://doi.org/10.3758/CABN.4.2.208
Sternberg, S., & Knoll, R. L. (1973). The perception of temporal order: Fundamental issues and a general model. In S. Kornblum (Ed.), Attention and performance IV (pp. 629–685). New York, NY: Academic Press.
Stevenson, R. A., Fister, J. K., Barnett, Z. P., Nidiffer, A. R., & Wallace, M. T. (2012). Interactions between the spatial and temporal stimulus factors that influence multisensory integration in human performance. Experimental Brain Research, 219, 121–137.
Strybel, T. Z., Manligas, C. L., Chan, O., & Perrott, D. R. (1990). A comparison of the effects of spatial separation on apparent motion in the auditory and visual modalities. Perception & Psychophysics, 47, 439–448. https://doi.org/10.3758/BF03208177
Takahashi, T., Kansaku, K., Wada, M., Shibuya, S., & Kitazawa, S. (2013). Neural correlates of tactile temporal-order judgment in humans: An fMRI study. Cerebral Cortex, 23, 1952–1964.
Vroomen, J., & de Gelder, B. (2004). Temporal ventriloquism: Sound modulates the flash-lag effect. Journal of Experimental Psychology: Human Perception and Performance, 30, 513–518.
Wada, M., Yamamoto, S., & Kitazawa, S. (2004). Effects of handedness on tactile temporal order judgment. Neuropsychologia, 42, 1887–1895.
Yamamoto, S., & Kitazawa, S. (2001). Reversal of subjective temporal order due to arm crossing. Nature Neuroscience, 4, 759–765.
Zampini, M., Harris, C., & Spence, C. (2005). Effect of posture change on tactile perception: Impaired direction discrimination performance with interleaved fingers. Experimental Brain Research, 166, 498–508. https://doi.org/10.1007/s00221-005-2390-y
Author note
This work was supported by the German Research Foundation (DFG) (Grant Numbers BA 5600/1-1 to S.B., TRR 169 subproject A1 to B.R., and BR 4913/2-1 to P.B.). We thank Samantha Schröder for help collecting data.
Badde, S., Röder, B. & Bruns, P. Task-irrelevant sounds influence both temporal order and apparent-motion judgments about tactile stimuli applied to crossed and uncrossed hands. Atten Percept Psychophys 80, 773–783 (2018). https://doi.org/10.3758/s13414-017-1476-5