Experimental Brain Research, Volume 193, Issue 4, pp 603–614

The dog’s meow: asymmetrical interaction in cross-modal object recognition

Research Article

Abstract

Little is known about cross-modal interaction in complex object recognition. The factors influencing this interaction were investigated using simultaneous presentation of pictures and vocalizations of animals. In separate blocks, the task was to identify either the visual or the auditory stimulus while ignoring the other modality. The pictures and sounds were congruent (same animal), incongruent (different animals), or neutral (an animal paired with a meaningless stimulus). Performance on congruent trials was better than on incongruent trials regardless of whether subjects attended to the visual or the auditory stimuli, but the effect was larger in the latter case. This asymmetry persisted when a long delay was inserted between the stimulus and the response; thus, it cannot be explained by a lack of processing time for the auditory stimulus. However, the asymmetry was eliminated when low-contrast visual stimuli were used. These findings suggest that when visual stimulation is highly informative, it affects auditory recognition more than auditory stimulation affects visual recognition. Nevertheless, this modality dominance is not rigid; it is strongly influenced by the quality of the presented information.
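
To make the design concrete, here is a minimal sketch, in Python, of how the congruency effect and its asymmetry across attended modalities could be computed from trial-level data. It is an illustration only, not the authors' analysis code; the trial fields (attended, congruency, rt) and the toy values are assumptions.

```python
# Illustrative sketch only: congruency effect on reaction time (RT) per
# attended modality, and the asymmetry between attend-visual and
# attend-auditory blocks. Field names and values are assumed, not from the paper.
from statistics import mean

trials = [
    {"attended": "visual",   "congruency": "congruent",   "rt": 0.52},
    {"attended": "visual",   "congruency": "incongruent", "rt": 0.55},
    {"attended": "auditory", "congruency": "congruent",   "rt": 0.61},
    {"attended": "auditory", "congruency": "incongruent", "rt": 0.72},
]

def mean_rt(attended, congruency):
    """Mean RT (s) over trials in one attended-modality x congruency cell."""
    return mean(t["rt"] for t in trials
                if t["attended"] == attended and t["congruency"] == congruency)

# Congruency effect: incongruent RT minus congruent RT, per attended modality.
effects = {a: mean_rt(a, "incongruent") - mean_rt(a, "congruent")
           for a in ("visual", "auditory")}

# A larger effect in attend-auditory blocks than in attend-visual blocks is
# the asymmetry reported in the abstract: vision influences auditory
# recognition more than the reverse.
asymmetry = effects["auditory"] - effects["visual"]
print(effects, asymmetry)
```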

Keywords

Auditory · Visual · Human · Multisensory · Object recognition · Conflict

Supplementary material

221_2008_1664_MOESM1_ESM.pdf (109 kb)
Supplemental Fig. 1: Facilitation and interference relative to neutral. For reaction time, the congruent (C, crossed bars) or incongruent (IC, dotted bars) RT was subtracted from the neutral RT; for accuracy, the neutral hit rate was subtracted from the congruent or incongruent hit rate. Facilitation is therefore plotted upwards and interference downwards for both measures. Dark bars: attend-visual; light bars: attend-auditory. Error bars denote the standard error. (a) Short-delay condition of Experiment 1. (b) Long-delay condition of Experiment 1. (c) Experiment 2.
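
The facilitation and interference measures defined in this caption reduce to simple differences against the neutral condition. The sketch below follows those definitions; the numbers are illustrative assumptions, not data from the study.

```python
# Facilitation/interference relative to neutral, following the caption's
# definitions. All values below are illustrative assumptions, not study data.
neutral_rt, congruent_rt, incongruent_rt = 0.60, 0.56, 0.66   # seconds
neutral_hr, congruent_hr, incongruent_hr = 0.90, 0.94, 0.84   # hit rates

# RT: congruent or incongruent RT subtracted from the neutral RT, so that
# facilitation is positive (plotted upwards) and interference negative.
rt_facilitation = neutral_rt - congruent_rt      # ~ +0.04 s
rt_interference = neutral_rt - incongruent_rt    # ~ -0.06 s

# Accuracy: neutral hit rate subtracted from the congruent or incongruent
# hit rate, keeping the same sign convention.
acc_facilitation = congruent_hr - neutral_hr     # ~ +0.04
acc_interference = incongruent_hr - neutral_hr   # ~ -0.06

print(rt_facilitation, rt_interference, acc_facilitation, acc_interference)
```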

Copyright information

© Springer-Verlag 2008

Authors and Affiliations

  1. Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel
  2. Interdisciplinary Center for Neural Computation, The Hebrew University of Jerusalem, Jerusalem, Israel
