
Attention to touch weakens audiovisual speech integration

  • Research Note
  • Published in Experimental Brain Research

Abstract

One of the classic examples of multisensory integration in humans occurs when speech sounds are combined with the sight of the corresponding articulatory gestures. Despite the longstanding assumption that this kind of audiovisual binding operates in an attention-free mode, recent findings (Alsius et al. in Curr Biol 15(9):839–843, 2005) suggest that audiovisual speech integration decreases when visual or auditory attentional resources are depleted. The present study addressed the generality of this attentional constraint by testing whether a similar decrease in multisensory integration is observed when attention demands are imposed on a sensory domain not involved in speech perception, such as touch. We measured the McGurk illusion in a dual-task paradigm involving a difficult tactile task. The results showed that the percentage of visually influenced responses to audiovisual stimuli was reduced when attention was diverted to the tactile task. We attribute this finding to a modulation of audiovisual speech integration mediated by supramodal attention limitations, and suggest that the interactions between the attentional system and crossmodal binding mechanisms may be much more extensive and dynamic than previously proposed.

Fig. 1
Fig. 2

Notes

  1. Due to intrinsic constraints of the lexicon, only a few exemplars in our lists allowed a new word to be created from the given pair of dubbed words. For most pairs, the expected combination of the acoustic and the visual stimulus corresponded to the visual word. For this reason, fusion and visual responses were pooled in the analyses.

  2. A within-subjects design was used on the basis of evidence from a previous pilot experiment in which, as in Alsius et al. (2005), the effects of task were tested in a between-participants design. That pilot study showed a trend towards the same attentional effects reported here, but the effects did not reach significance. As attentional competition has been shown to be weaker between sensory modalities than within a single modality (e.g., Eimer and Van Velzen 2002; Hillyard et al. 1984; McDonald and Ward 2000), a within-participants design was implemented in the present experiment in order to gain statistical power and detect any small but reliable effects of attention on audiovisual integration.

  3. Note that if the demands of the concurrent task had disrupted both auditory and visual unimodal processing, one would expect a reduction of visually influenced responses in the visual-only condition, but an increase of these responses in the auditory displays (participants’ mishearing of the auditory words would lead to an increase in reports of their visual counterparts, due to phonological similarity).

References

  • Alais D, Morrone C, Burr D (2006) Separate attentional resources for vision and audition. Proc Biol Sci 273(1592):1339–1345

  • Alsius A, Navarra J, Campbell R, Soto-Faraco S (2005) Audiovisual integration of speech falters under high attention demands. Curr Biol 15(9):839–843

  • Amedi A, Malach R, Hendler T, Peled S, Zohary E (2001) Visuo-haptic object-related activation in the ventral visual pathway. Nat Neurosci 4:324–330

  • Bernstein LE, Auer ET Jr, Moore JK (2004) Audiovisual speech binding: convergence or association? In: Calvert GA, Spence C, Stein BE (eds) The handbook of multisensory processes. The MIT Press, Cambridge, pp 203–224

  • Bertelson P, Radeau M (1981) Cross-modal bias and perceptual fusion with auditory-visual spatial discordance. Percept Psychophys 29:578–584

  • Bertelson P, Vroomen J, de Gelder B, Driver J (2000) The ventriloquist effect does not depend on the direction of deliberate visual attention. Percept Psychophys 62(2):321–332

  • Burnham D, Dodd B (2004) Auditory-visual speech integration by prelinguistic infants: perception of an emergent consonant in the McGurk effect. Dev Psychobiol 45(4):204–220

  • Calvert GA, Brammer MJ, Bullmore ET, Campbell R, Iversen SD, David AS (1999) Response amplification in sensory-specific cortices during cross-modal binding. Neuroreport 10:2619–2623

  • Calvert GA, Campbell R, Brammer MJ (2000) Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex. Curr Biol 10(11):649–657

  • Colin C, Radeau M, Soquet A, Demolin D, Colin F, Deltenre P (2002) Mismatch negativity evoked by the McGurk–MacDonald effect: a phonetic representation within short-term memory. Clin Neurophysiol 113:495–506

  • de Gelder B, Bertelson P (2003) Multisensory integration, perception and ecological validity. Trends Cogn Sci 7(10):460–467

  • Degerman A, Rinne T, Pekkola J, Autti T, Jääskeläinen I, Sams M, Alho K (2007) Human brain activity associated with audiovisual perception and attention. Neuroimage 34(4):1683–1691

  • Duncan J, Martens S, Ward R (1997) Restricted attentional capacity within but not between sensory modalities. Nature 387:808–810

  • Eimer M, Van Velzen J (2002) Crossmodal links in spatial attention are mediated by supramodal control processes: evidence from event-related brain potentials. Psychophysiol 39:437–449

  • Eimer M, van Velzen J, Driver J (2002) Crossmodal interactions between audition, touch and vision in endogenous spatial attention: ERP evidence on preparatory states and sensory modulations. J Cogn Neurosci 14:254–271

  • Fujisaki W, Koene A, Arnold D, Johnston A, Nishida S (2006) Visual search for a target changing in synchrony with an auditory signal. Proc R Soc B 273:865–874

  • Ghazanfar AA, Logothetis NK (2003) Facial expressions linked to monkey calls. Nature 423:937–938

  • Green KP, Kuhl PK (1991) Integral processing of visual place and auditory voicing information during phonetic perception. J Exp Psychol Hum Percept Perform 17:278–288

  • Hillyard SA, Simpson GV, Woods DL, VanVoorhis S, Münte TF (1984) Event-related brain potentials and selective attention to different modalities. In: Reinoso-Suarez F, Aimone-Marsan C (eds) Cortical integration. Raven, New York, pp 395–413

  • Kaiser J, Hertrich I, Ackermann H, Mathiak K, Lutzenberger W (2004) Hearing lips: gamma-band activity during audiovisual speech perception. Cereb Cortex 15:646–653

  • Kanwisher N, Wojciulik E (2000) Visual attention: insights from brain imaging. Nat Rev Neurosci 1:91–100

  • Kuhl PK, Meltzoff AN (1982) The bimodal perception of speech in infancy. Science 218:1138–1141

  • Lavie N (1995) Perceptual load as a necessary condition for selective attention. J Exp Psychol Hum Percept Perform 21:451–468

  • Lewkowicz DJ, Ghazanfar AA (2006) The decline of cross-species intersensory perception in human infants. Proc Natl Acad Sci USA 103:6771–6774

  • Macaluso E, Frith CD, Driver J (2002) Supramodal effects of covert spatial orienting triggered by visual or tactile events. J Cogn Neurosci 14(3):389–401

  • Massaro DW (1987) Speech perception by ear and eye. LEA, Hillsdale

  • Massaro DW (1998) Perceiving talking faces: from speech perception to a behavioral principle. MIT Press, Cambridge

  • Mattys S, Bernstein LE, Auer ET (2002) Stimulus-based lexical distinctiveness as a general word recognition mechanism. Percept Psychophys 64(4):667–679

  • McDonald JJ, Ward LM (2000) Involuntary listening aids seeing: evidence from human electrophysiology. Psychol Sci 11:167–171

  • McGurk H, MacDonald J (1976) Hearing lips and seeing voices. Nature 265:746–748

  • Pick HL Jr, Warren DH, Hay JC (1969) Sensory conflict in judgements of spatial direction. Percept Psychophys 6:203–205

  • Rees G, Frith CD, Lavie N (2001) Perception of irrelevant visual motion during performance of an auditory task. Neuropsychologia 39:937–949

  • Soto-Faraco S, Alsius A (2006) Conscious access to the unisensory components of a cross-modal illusion. Neuroreport 18:347–350

  • Soto-Faraco S, Navarra J, Alsius A (2004) Assessing automaticity in audiovisual speech integration: evidence from the speeded classification task. Cognition 92:B13–B23

  • Spence C, Driver J (eds) (2004) Crossmodal space and crossmodal attention. Oxford University Press, Oxford

  • Talsma D, Woldorff MG (2005) Selective attention and multisensory integration: multiple phases of effects on the evoked brain activity. J Cogn Neurosci 17(7):1098–1114

  • Talsma D, Doty T, Woldorff MG (2007) Selective attention and audiovisual integration: is attending to both modalities a prerequisite for early integration? Cereb Cortex 17:679–690

  • Tiippana K, Andersen TS, Sams M (2004) Visual attention modulates audiovisual speech perception. Eur J Cogn Psychol 16:457–472

  • Tuomainen J, Andersen TS, Tiippana K, Sams M (2005) Audio-visual speech perception is special. Cognition 96(1):B13–B22

  • Van Atteveldt NM, Formisano E, Goebel R, Blomert L (2007) Top–down task effects overrule automatic multisensory responses to letter–sound pairs in auditory association cortex. Neuroimage 36(4):1345–1360

  • Vroomen J, Driver J, de Gelder B (2001a) Is cross-modal integration of emotional expressions independent of attentional resources? Cogn Affect Behav Neurosci 1:382–387

  • Vroomen J, Bertelson P, de Gelder B (2001b) The ventriloquist effect does not depend on the direction of automatic visual attention. Percept Psychophys 63:651–659

  • Wickens CD (1984) Processing resources in attention. In: Parasuraman R, Daves DR (eds) Varieties of attention. Academic Press, Orlando, pp 63–101

Author information

Corresponding author

Correspondence to Salvador Soto-Faraco.

About this article

Cite this article

Alsius, A., Navarra, J. & Soto-Faraco, S. Attention to touch weakens audiovisual speech integration. Exp Brain Res 183, 399–404 (2007). https://doi.org/10.1007/s00221-007-1110-1
