Abstract
Passive movement through an environment is typically perceived by integrating information from different sensory signals, including visual and vestibular information. A wealth of previous research in the field of multisensory integration has shown that if different sensory signals are spatially or temporally discrepant, they may not combine in a statistically optimal fashion; however, this has not been well explored for visual–vestibular integration. Self-motion perception involves the integration of various movement parameters, including displacement, velocity, acceleration and higher derivatives such as jerk. It is often assumed that the vestibular system is optimized for processing acceleration and higher derivatives, while the visual system is specialized for position and velocity. To determine how different spatiotemporal properties interact during self-motion perception, Experiment 1 first asked whether the velocity profile of a visual trajectory affects discrimination performance in a heading task. Stationary participants performed a two-interval forced-choice heading discrimination task in which the visual stimulus moved either at a constant velocity (C-Vis) or with a raised cosine velocity (R-Vis) motion profile. Experiment 2 was designed to assess how the visual and vestibular velocity profiles combined during the same heading task. In this case, participants were seated on a Stewart motion platform and motion information was presented via visual information alone, vestibular information alone or both cues combined. The combined condition consisted of congruent blocks (R-Vis/R-Vest), in which both visual and vestibular cues followed a raised cosine velocity profile, and incongruent blocks (C-Vis/R-Vest), in which the visual motion had a constant velocity profile while the vestibular motion followed a raised cosine velocity profile.
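The two motion profiles can be sketched as follows. This is an illustrative reconstruction, not the authors' stimulus code, and the displacement and duration values are made up: a raised cosine velocity profile covers the same distance as a constant-velocity one but starts and ends at zero velocity, so it carries a non-zero acceleration signal throughout the movement for the vestibular (otolith) organs.

```python
import numpy as np

def raised_cosine_velocity(displacement, duration, n=101):
    """Raised cosine velocity profile: v(t) = (D/T) * (1 - cos(2*pi*t/T)).

    Starts and ends at zero velocity and integrates to `displacement`
    over [0, duration], so acceleration is non-zero during the motion.
    """
    t = np.linspace(0.0, duration, n)
    v = (displacement / duration) * (1.0 - np.cos(2.0 * np.pi * t / duration))
    return t, v

def constant_velocity(displacement, duration, n=101):
    """Constant velocity profile covering the same displacement."""
    t = np.linspace(0.0, duration, n)
    return t, np.full(n, displacement / duration)

def trapezoid(v, t):
    """Trapezoid-rule integral of v over t (total displacement)."""
    return float(np.sum((v[1:] + v[:-1]) * np.diff(t)) / 2.0)

# Illustrative numbers (not from the paper): 0.2 m travelled in 1 s.
t, v_rcos = raised_cosine_velocity(0.2, 1.0)
_, v_const = constant_velocity(0.2, 1.0)
# Both profiles cover the same distance (~0.2 m), but only the raised
# cosine provides a usable acceleration signal to the vestibular system.
print(trapezoid(v_rcos, t), trapezoid(v_const, t))
```

Note that the raised cosine peaks at twice the mean velocity (2D/T) at mid-movement, while the constant profile stays at D/T throughout.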
Results from both Experiments 1 and 2 demonstrated that visual heading estimates are indeed affected by the velocity profile of the movement trajectory, with lower thresholds for the R-Vis than for the C-Vis profile. In Experiment 2, when visual and vestibular inputs were both present, they were combined in a statistically optimal fashion irrespective of the type of visual velocity profile, demonstrating robust integration of visual and vestibular cues. The study suggests that while the time course of the velocity did affect visual heading judgments, a moderate conflict between visual and vestibular motion profiles does not cause a breakdown in optimal integration for heading.
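The "statistically optimal fashion" referred to here is the standard maximum-likelihood (MLE) cue-combination model (Ernst and Banks 2002): each cue is weighted by its inverse variance, and the combined estimate is predicted to be more precise than either cue alone. A minimal sketch of those predictions, using made-up single-cue thresholds rather than values from the paper:

```python
import math

def mle_combined_sigma(sigma_vis, sigma_vest):
    """Predicted combined threshold under MLE cue integration:
    1/sigma_comb^2 = 1/sigma_vis^2 + 1/sigma_vest^2.
    """
    return math.sqrt((sigma_vis**2 * sigma_vest**2)
                     / (sigma_vis**2 + sigma_vest**2))

def mle_weights(sigma_vis, sigma_vest):
    """Reliability-based cue weights (inverse-variance weighting)."""
    w_vis = sigma_vest**2 / (sigma_vis**2 + sigma_vest**2)
    return w_vis, 1.0 - w_vis

# Hypothetical single-cue heading thresholds, in degrees:
sigma_vis, sigma_vest = 3.0, 4.0
print(mle_combined_sigma(sigma_vis, sigma_vest))  # 2.4: below either cue alone
print(mle_weights(sigma_vis, sigma_vest))         # more weight on the better cue
```

Optimality is tested by comparing the empirically measured combined threshold against this prediction; the abstract reports that the match held even in the incongruent C-Vis/R-Vest blocks.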
Acknowledgments
This research was supported by the Max Planck Society and by the Brain Korea 21 PLUS Program through the National Research Foundation of Korea funded by the Ministry of Education. We are grateful to the participants for their time. We are also grateful to Paul “Pogen” MacNeilage, Julian Hoffmann, Edel Flynn, Karl Beykirch, Stuart Smith and Jack Loomis for their help at different stages of this project.
Cite this article
Butler, J.S., Campos, J.L. & Bülthoff, H.H. Optimal visual–vestibular integration under conditions of conflicting intersensory motion profiles. Exp Brain Res 233, 587–597 (2015). https://doi.org/10.1007/s00221-014-4136-1