Experimental Brain Research, Volume 183, Issue 3, pp 399–404

Attention to touch weakens audiovisual speech integration

  • Agnès Alsius
  • Jordi Navarra
  • Salvador Soto-Faraco
Research Note

Abstract

One of the classic examples of multisensory integration in humans occurs when speech sounds are combined with the sight of the corresponding articulatory gestures. Despite the longstanding assumption that this kind of audiovisual binding operates in an attention-free mode, recent findings (Alsius et al. in Curr Biol 15(9):839–843, 2005) suggest that audiovisual speech integration decreases when visual or auditory attentional resources are depleted. The present study addressed the generality of this attentional constraint by testing whether a similar decrease in multisensory integration is observed when attention demands are imposed on a sensory domain that is not involved in speech perception, such as touch. We measured the McGurk illusion in a dual-task paradigm involving a difficult tactile task. The results showed that the percentage of visually influenced responses to audiovisual stimuli was reduced when attention was diverted to the tactile task. This finding is attributed to a modulation of audiovisual speech integration mediated by supramodal attentional limitations. We suggest that the interactions between the attentional system and crossmodal binding mechanisms may be much more extensive and dynamic than previously proposed.
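
The dependent measure is the percentage of visually influenced (McGurk-type) responses to incongruent audiovisual stimuli, compared between a single-task baseline and the dual-task (concurrent tactile) condition. As a minimal illustrative sketch only, assuming hypothetical trial records with invented condition and response labels (this is not the study's data or analysis code), such percentages could be computed as follows:

    # Hypothetical sketch: percentage of visually influenced responses per
    # condition. The trial records below are invented for illustration only.
    trials = [
        ("single_task", "visually_influenced"),
        ("single_task", "auditory"),
        ("single_task", "visually_influenced"),
        ("dual_task", "auditory"),
        ("dual_task", "visually_influenced"),
        ("dual_task", "auditory"),
    ]

    def pct_visually_influenced(trials, condition):
        """Percentage of responses in `condition` that follow the visual stream."""
        responses = [resp for cond, resp in trials if cond == condition]
        return 100.0 * sum(r == "visually_influenced" for r in responses) / len(responses)

    for condition in ("single_task", "dual_task"):
        print(condition, round(pct_visually_influenced(trials, condition), 1))

A lower percentage in the dual-task condition than in the single-task baseline would correspond to the reported weakening of audiovisual integration under tactile attention load.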

Keywords

Attention · Multisensory integration · Speech perception · Touch

References

  1. Alais D, Morrone C, Burr D (2006) Separate attentional resources for vision and audition. Proc Biol Sci 273(1592):1339–1345
  2. Amedi A, Malach R, Hendler T, Peled S, Zohary E (2001) Visuo-haptic object-related activation in the ventral visual pathway. Nat Neurosci 4:324–330
  3. Alsius A, Navarra J, Campbell R, Soto-Faraco S (2005) Audiovisual integration of speech falters under high attention demands. Curr Biol 15(9):839–843
  4. Bernstein LE, Auer ET Jr, Moore JK (2004) Audiovisual speech binding: convergence or association? In: Calvert GA, Spence C, Stein BE (eds) The handbook of multisensory processes. The MIT Press, Cambridge, pp 203–224
  5. Bertelson P, Radeau M (1981) Cross-modal bias and perceptual fusion with auditory-visual spatial discordance. Percept Psychophys 29:578–584
  6. Bertelson P, Vroomen J, de Gelder B, Driver J (2000) The ventriloquist effect does not depend on the direction of deliberate visual attention. Percept Psychophys 62(2):321–332
  7. Burnham D, Dodd B (2004) Auditory-visual speech integration by prelinguistic infants: perception of an emergent consonant in the McGurk effect. Dev Psychobiol 45(4):204–220
  8. Calvert GA, Brammer MJ, Bullmore ET, Campbell R, Iversen SD, David AS (1999) Response amplification in sensory-specific cortices during cross-modal binding. Neuroreport 10:2619–2623
  9. Calvert GA, Campbell R, Brammer MJ (2000) Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex. Curr Biol 10(11):649–657
  10. Colin C, Radeau M, Soquet A, Demolin D, Colin F, Deltenre P (2002) Mismatch negativity evoked by the McGurk–MacDonald effect: a phonetic representation within short-term memory. Clin Neurophysiol 113:495–506
  11. de Gelder B, Bertelson P (2003) Multisensory integration, perception and ecological validity. Trends Cogn Sci 7(10):460–467
  12. Degerman A, Rinne T, Pekkola J, Autti T, Jääskeläinen I, Sams M, Alho K (2007) Human brain activity associated with audiovisual perception and attention. Neuroimage 34(4):1683–1691
  13. Duncan J, Martens S, Ward R (1997) Restricted attentional capacity within but not between sensory modalities. Nature 387:808–810
  14. Eimer M, Van Velzen J (2002) Crossmodal links in spatial attention are mediated by supramodal control processes: evidence from event-related brain potentials. Psychophysiology 39:437–449
  15. Eimer M, van Velzen J, Driver J (2002) Crossmodal interactions between audition, touch and vision in endogenous spatial attention: ERP evidence on preparatory states and sensory modulations. J Cogn Neurosci 14:254–271
  16. Fujisaki W, Koene A, Arnold D, Johnston A, Nishida S (2006) Visual search for a target changing in synchrony with an auditory signal. Proc R Soc B 273:865–874
  17. Ghazanfar AA, Logothetis NK (2003) Facial expressions linked to monkey calls. Nature 423:937–938
  18. Green KP, Kuhl PK (1991) Integral processing of visual place and auditory voicing information during phonetic perception. J Exp Psychol Hum Percept Perform 17:278–288
  19. Hillyard SA, Simpson GV, Woods DL, Van Voorhis S, Münte TF (1984) Event-related brain potentials and selective attention to different modalities. In: Reinoso-Suarez F, Aimone-Marsan C (eds) Cortical integration. Raven, New York, pp 395–413
  20. Kaiser J, Hertrich I, Ackermann H, Mathiak K, Lutzenberger W (2004) Hearing lips: gamma-band activity during audiovisual speech perception. Cereb Cortex 15:646–653
  21. Kanwisher N, Wojciulik E (2000) Visual attention: insights from brain imaging. Nat Rev Neurosci 1:91–100
  22. Kuhl PK, Meltzoff AN (1982) The bimodal perception of speech in infancy. Science 218:1138–1141
  23. Lavie N (1995) Perceptual load as a necessary condition for selective attention. J Exp Psychol Hum Percept Perform 21:451–468
  24. Lewkowicz DJ, Ghazanfar AA (2006) The decline of cross-species intersensory perception in human infants. Proc Natl Acad Sci USA 103:6771–6774
  25. Macaluso E, Frith CD, Driver J (2002) Supramodal effects of covert spatial orienting triggered by visual or tactile events. J Cogn Neurosci 14(3):389–401
  26. McDonald JJ, Ward LM (2000) Involuntary listening aids seeing: evidence from human electrophysiology. Psychol Sci 11:167–171
  27. Mattys S, Bernstein LE, Auer ET (2002) Stimulus-based lexical distinctiveness as a general word recognition mechanism. Percept Psychophys 64(4):667–679
  28. Massaro DW (1987) Speech perception by ear and eye. LEA, Hillsdale
  29. Massaro DW (1998) Perceiving talking faces: from speech perception to a behavioral principle. MIT Press, Cambridge
  30. McGurk H, MacDonald J (1976) Hearing lips and seeing voices. Nature 264:746–748
  31. Pick HL Jr, Warren DH, Hay JC (1969) Sensory conflict in judgements of spatial direction. Percept Psychophys 6:203–205
  32. Rees G, Frith CD, Lavie N (2001) Perception of irrelevant visual motion during performance of an auditory task. Neuropsychologia 39:937–949
  33. Soto-Faraco S, Alsius A (2006) Conscious access to the unisensory components of a cross-modal illusion. Neuroreport 18:347–350
  34. Soto-Faraco S, Navarra J, Alsius A (2004) Assessing automaticity in audiovisual speech integration: evidence from the speeded classification task. Cognition 92:B13–B23
  35. Spence C, Driver J (eds) (2004) Crossmodal space and crossmodal attention. Oxford University Press, Oxford
  36. Talsma D, Woldorff MG (2005) Selective attention and multisensory integration: multiple phases of effects on the evoked brain activity. J Cogn Neurosci 17(7):1098–1114
  37. Talsma D, Doty T, Woldorff MG (2007) Selective attention and audiovisual integration: is attending to both modalities a prerequisite for early integration? Cereb Cortex 17:679–690
  38. Tiippana K, Andersen TS, Sams M (2004) Visual attention modulates audiovisual speech perception. Eur J Cogn Psychol 16:457–472
  39. Tuomainen J, Andersen TS, Tiippana K, Sams M (2005) Audio-visual speech perception is special. Cognition 96(1):B13–B22
  40. Van Atteveldt NM, Formisano E, Goebel R, Blomert L (2007) Top–down task effects overrule automatic multisensory responses to letter–sound pairs in auditory association cortex. Neuroimage 36(4):1345–1360
  41. Vroomen J, Driver J, de Gelder B (2001a) Is cross-modal integration of emotional expressions independent of attentional resources? Cogn Affect Behav Neurosci 1:382–387
  42. Vroomen J, Bertelson P, de Gelder B (2001b) The ventriloquist effect does not depend on the direction of automatic visual attention. Percept Psychophys 63:651–659
  43. Wickens CD (1984) Processing resources in attention. In: Parasuraman R, Davies DR (eds) Varieties of attention. Academic Press, Orlando, pp 63–101

Copyright information

© Springer-Verlag 2007

Authors and Affiliations

  • Agnès Alsius 1, 2
  • Jordi Navarra 2, 3
  • Salvador Soto-Faraco 1, 2, 4
  1. Departament de Psicologia Bàsica, Universitat de Barcelona, Barcelona, Spain
  2. Parc Científic de Barcelona, Hospital Sant Joan de Déu (Edifici Docent), Esplugues de Llobregat (Barcelona), Spain
  3. Department of Experimental Psychology (Crossmodal Research Laboratory), University of Oxford, Oxford, UK
  4. Institució Catalana de Recerca i Estudis Avançats, Barcelona, Spain