Perception & Psychophysics, Volume 70, Issue 7, pp 1207–1216

Auditory and visual attention-based apparent motion share functional parallels

  • Wendy E. Huddleston
  • James W. Lewis
  • Raymond E. Phinney
  • Edgar A. DeYoe

Abstract

A perception of coherent motion can be obtained in an otherwise ambiguous or illusory visual display by directing one's attention to a feature and tracking it. We demonstrate an analogous auditory effect in two separate sets of experiments. The temporal dynamics associated with the attention-dependent auditory motion closely matched those previously reported for attention-based visual motion. Since attention-based motion mechanisms appear to exist in both modalities, we also tested for multimodal (audiovisual) attention-based motion, using stimuli composed of interleaved visual and auditory cues. Although subjects were able to track a trajectory using cues from both modalities, none spontaneously perceived “multimodal motion” across both visual and auditory cues. Rather, they reported motion perception only within each modality, thereby revealing a spatiotemporal limit on putative cross-modal motion integration. Together, results from these experiments demonstrate the existence of attention-based motion in audition, extending current theories of attention-based mechanisms from visual to auditory systems.

Copyright information

© Psychonomic Society, Inc. 2008

Authors and Affiliations

  • Wendy E. Huddleston (1)
  • James W. Lewis (2)
  • Raymond E. Phinney (3)
  • Edgar A. DeYoe (4)

  1. Department of Human Movement Sciences, University of Wisconsin, Milwaukee
  2. West Virginia University, Morgantown
  3. Wheaton College, Wheaton
  4. Medical College of Wisconsin, Milwaukee