Experimental Brain Research, Volume 180, Issue 4, pp 641–654

Multi-sensory integration of spatio-temporal segmentation cues: one plus one does not always equal two

  • Feng Zhou
  • Victoria Wong
  • Robert Sekuler
Research Article

Abstract

How are multiple, multi-sensory stimuli combined for use in segmenting spatio-temporal events? For an answer, we measured the effect of various auditory or visual stimuli, in isolation or in combination, on a bistable percept of visual motion (“bouncing” vs. “streaming”). To minimize individual differences, the physical properties of stimuli were adjusted to reflect individual subjects’ sensitivity to each cue in isolation. When put into combination, perceptual influences that had been equipotent in isolation were substantially altered. Specifically, auditory cues that had been strong when presented alone were greatly reduced in combination. Evaluation of alternative models of sensory integration showed that the state of the visual bistable percept could not be accounted for by probability summation among cues, as might occur at the level of decision processes. Instead, the state of the bistable percept was well predicted from a weighted sum of cues, with visual cues strongly dominating auditory cues. Finally, when cue weights were compared for individual subjects, it was found that subjects differ somewhat in the strategy they use for integrating multi-sensory information.
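The two integration models weighed against each other in the abstract can be made concrete. The sketch below is illustrative only, not the authors' code: the cue probabilities and weights are hypothetical placeholders, chosen to mirror the paper's setup in which each cue was equated to be equipotent in isolation. Probability summation predicts a large boost from combining cues; a weighted sum with visual cues dominating predicts the muted auditory influence actually observed.

```python
def probability_summation(p_cues):
    """Decision-level independence: "bouncing" is reported if any cue
    alone would have triggered it, so P = 1 - prod(1 - p_i)."""
    prod = 1.0
    for p in p_cues:
        prod *= 1.0 - p
    return 1.0 - prod

def weighted_sum(p_cues, weights):
    """Early integration: a single internal estimate formed as a
    weighted average of cue strengths (weights normalized to sum to 1)."""
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, p_cues)) / total

# Hypothetical values: one auditory and one visual cue, each adjusted to
# be equipotent in isolation (P(bouncing) = 0.7 for either alone).
cues = [0.7, 0.7]
print(probability_summation(cues))     # 0.91 — strong combined effect
print(weighted_sum(cues, [1.0, 4.0]))  # 0.7 — visual weight dominates
```

Under probability summation, adding an equipotent auditory cue should push P(bouncing) well above either cue's solo value; under a visually dominated weighted sum it barely moves, which is the pattern the abstract reports.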

Keywords

Root Mean Square Difference, Psychometric Function, Multisensory Integration, Weibull Function, Auditory Motion

Notes

Acknowledgments

We thank Larry Abbott, Yuko Yotsumoto, Takeo Watanabe, Allison B. Sekuler, and Kristina Visscher for excellent suggestions. Feng Zhou is now at the Department of Psychological and Brain Sciences, the Johns Hopkins University. Victoria Wong was supported by an NSF IGERT fellowship, and by an undergraduate research grant; she is currently at the School of Medicine, University of Hawaii. Research supported by AFOSR grant F49620-03-1-0376 and National Institutes of Health grant MH-55687. e-mail: sekuler@brandeis.edu.

Supplementary material

221_2007_897_MOESM1_ESM.pdf (25 kb)
Table 1: Mean P(bouncing) for various cues and cue combinations


Copyright information

© Springer-Verlag 2007

Authors and Affiliations

  1. Brandeis University, Waltham, USA
