Computing an optimal time window of audiovisual integration in focused attention tasks: illustrated by studies on effect of age and prior knowledge

Research Article · Experimental Brain Research

Abstract

The concept of a "time window of integration" holds that information from different sensory modalities must not be perceived too far apart in time if it is to be integrated into a multisensory perceptual event. Empirical estimates of window width differ widely, however, ranging from 40 to 600 ms depending on context and experimental paradigm. Searching for a theoretical derivation of window width, Colonius and Diederich (Front Integr Neurosci 2010) developed a decision-theoretic framework using a decision rule based on the prior probability of a common source, the likelihood of the temporal disparity between the unimodal signals, and the payoff for making right or wrong decisions. Here, this framework is extended to the focused attention task, in which subjects are asked to respond to signals from a target modality only. Invoking the framework of the time-window-of-integration (TWIN) model, an explicit expression for the optimal window width is obtained. The approach is probed on two published focused attention studies. The first is a saccadic reaction time study assessing how the efficiency of multisensory integration varies as a function of aging. Although the window widths for young and older adults differ by nearly 200 ms, presumably due to their different peripheral processing speeds, neither deviates significantly from the optimal values. In the second study, head saccadic reaction times to a perfectly aligned audiovisual stimulus pair had been shown to depend on the prior probability of spatial alignment. Intriguingly, these reaction times reflected the magnitude of the time-window widths predicted by our decision-theoretic framework, i.e., a larger time window is associated with a higher prior probability.
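The exact decision rule is derived in the paper itself; the sketch below is only a hypothetical illustration of the kind of rule the abstract describes, in which a prior probability of a common source, a likelihood of the observed temporal disparity, and a payoff matrix are combined into an integrate/segregate decision. The likelihood functions, payoff values, and rates used here are assumptions for illustration, not quantities taken from the paper.

```python
import math

def posterior_common_source(delta_t, prior_common, f_common, f_separate):
    """Posterior probability of a common source given temporal disparity delta_t."""
    num = prior_common * f_common(delta_t)
    den = num + (1.0 - prior_common) * f_separate(delta_t)
    return num / den

def integrate_decision(delta_t, prior_common, f_common, f_separate, payoff):
    """Choose the action with the larger expected payoff.
    payoff[(action, state)] is a hypothetical payoff matrix with
    action in {'integrate', 'segregate'} and state in {'common', 'separate'}."""
    p = posterior_common_source(delta_t, prior_common, f_common, f_separate)
    ev_int = p * payoff[('integrate', 'common')] + (1 - p) * payoff[('integrate', 'separate')]
    ev_seg = p * payoff[('segregate', 'common')] + (1 - p) * payoff[('segregate', 'separate')]
    return 'integrate' if ev_int >= ev_seg else 'segregate'

# Illustrative choices (assumptions, not values from the paper):
lam = 1 / 50.0                                                  # Laplace rate in ms^-1
f_common = lambda d: 0.5 * lam * math.exp(-lam * abs(d))        # disparities peaked near 0 ms
f_separate = lambda d: 1.0 / 600.0 if abs(d) <= 300 else 0.0    # diffuse over +/- 300 ms
payoff = {('integrate', 'common'): 1.0, ('integrate', 'separate'): -1.0,
          ('segregate', 'common'): -0.5, ('segregate', 'separate'): 0.5}

for d in (20, 120, 280):
    print(d, integrate_decision(d, 0.7, f_common, f_separate, payoff))
```

The set of disparities for which "integrate" wins plays the role of the time window; raising `prior_common` enlarges that set, which matches the qualitative relation reported in the abstract for the prior-knowledge study.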

Notes

  1. See Cohen (2011) for a related discussion of the role of time in cognitive neuroscience.

  2. What remains unclear is whether the slowing is only due to an age-related decrement in peripheral processing or also in a more centrally located processing stage where integration of visual and auditory input is taking place.

  3. These remarks were prompted by some of the reviewers’ comments.

References

  • Alais D, Burr D (2004) The ventriloquist effect results from near-optimal bimodal integration. Curr Biol 14:257–262

  • Anastasio TJ, Patton PE, Belkacem-Boussaid K (2000) Using Bayes' rule to model multisensory enhancement in the superior colliculus. Neural Comput 12:1165–1187

  • Battaglia PW, Jacobs RA, Aslin RN (2003) Bayesian integration of visual and auditory signals for spatial localization. J Opt Soc Am A Opt Image Sci Vis 20:1391–1397

  • Beierholm U, Koerding K, Shams L, Ma WJ (2008) Comparing Bayesian models for multisensory cue combination without mandatory integration. Advances in neural information processing systems, vol 20. The MIT Press, Cambridge, pp 81–88

  • Cohen MX (2011) It's about time. Front Hum Neurosci 5:2. doi:10.3389/fnhum.2011.00002

  • Colonius H, Arndt P (2001) A two-stage model for visual-auditory interaction in saccadic latencies. Percept Psychophys 63:126–147

  • Colonius H, Diederich A (2010) The optimal time window of visual-auditory integration: a reaction time analysis. Front Integr Neurosci 4:11. doi:10.3389/fnint.2010.00011

  • Colonius H, Diederich A (2006) Race model inequality: interpreting a geometric measure of the amount of violation. Psychol Rev 113(1):148–154

  • Colonius H, Diederich A (2004) Multisensory interaction in saccadic reaction time: a time-window-of-integration model. J Cogn Neurosci 16:1000–1009

  • Colonius H, Diederich A (2004) Why aren't all deep superior colliculus neurons multisensory? A Bayes' ratio analysis. Cogn Affect Behav Neurosci 4(3):344–353

  • Corneil BD, Munoz DP (1996) The influence of auditory and visual distractors on human orienting gaze shifts. J Neurosci 16:8193–8207

  • Diederich A, Colonius H (2011) Modeling multisensory processes in saccadic responses: time-window-of-integration model. In: Wallace MT, Murray MM (eds) Frontiers in the neural bases of multisensory processes. CRC Press, Boca Raton

  • Diederich A, Colonius H (2007) Why two "distractors" are better than one: modeling the effect of non-target auditory and tactile stimuli on visual saccadic reaction time. Exp Brain Res 179:43–54

  • Diederich A, Colonius H (2007) Modeling spatial effects in visual-tactile saccadic reaction time. Percept Psychophys 69(1):56–67

  • Diederich A, Colonius H (2008) Crossmodal interaction in saccadic reaction time: separating multisensory from warning effects in the time window of integration model. Exp Brain Res 186:1–22

  • Diederich A, Colonius H (2008) When a high-intensity "distractor" is better than a low-intensity one: modeling the effect of an auditory or tactile nontarget stimulus on visual saccadic reaction time. Brain Res 1242:219–230

  • Diederich A, Colonius H (2004) Modeling the time course of multisensory interaction in manual and saccadic responses. In: Calvert G, Spence C, Stein BE (eds) Handbook of multisensory processes. The MIT Press, Cambridge, pp 395–408

  • Diederich A, Colonius H (1987) Intersensory facilitation in the motor component? A reaction time analysis. Psychol Res 49:23–29

  • Diederich A, Colonius H, Schomburg A (2008) Assessing age-related multisensory enhancement with the time-window-of-integration model. Neuropsychologia 46:2556–2562

  • Di Luca M, Machulla TK, Ernst MO (2009) Recalibration of multisensory simultaneity: cross-modal transfer coincides with a change in perceptual latency. J Vis 9(12):1–16

  • Ernst MO (2007) Learning to integrate arbitrary signals from vision and touch. J Vis 7(5):1–14

  • Ernst MO, Banks MS (2002) Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415(6870):429–433

  • Ernst MO (2005) A Bayesian view on multimodal cue integration. In: Knoblich G, Thornton I, Grosjean M, Shiffrar M (eds) Human body perception from the inside out. Oxford University Press, New York, pp 105–131

  • Frens MA, Van Opstal AJ, Van der Willigen RF (1995) Spatial and temporal factors determine auditory-visual interactions in human saccadic eye movements. Percept Psychophys 57:802–816

  • Giray M, Ulrich R (1993) Motor coactivation revealed by response force in divided and focused attention. J Exp Psychol Hum Percept Perform 19(6):1278–1291

  • Harrington LK, Peck CK (1998) Spatial disparity affects visual-auditory interactions in human sensorimotor processing. Exp Brain Res 122:247–252

  • Hillis JM, Ernst MO, Banks MS, Landy MS (2002) Combining sensory information: mandatory fusion within, but not between, senses. Science 298(5598):1627–1630

  • Hugenschmidt CE, Mozolic JL, Tan H, Kraft RA, Laurienti PJ (2009) Age-related increase in cross-sensory noise in resting and steady-state cerebral perfusion. Brain Topogr 21(3–4):241–251

  • Hughes HC, Nelson MD, Aronchick DM (1998) Spatial characteristics of visual-auditory summation in human saccades. Vis Res 38:3955–3963

  • Innes-Brown H, Crewther D (2009) The impact of spatial incongruence on an auditory-visual illusion. PLoS ONE 4(7). doi:10.1371/journal.pone.0006450

  • Keetels M, Vroomen J (2007) No effect of auditory-visual spatial disparity on temporal recalibration. Exp Brain Res 182(4):559–565

  • Körding KP, Beierholm U, Ma WJ, Quartz S, Tenenbaum JB, Shams L (2007) Causal inference in multisensory perception. PLoS ONE 2(9):e943. doi:10.1371/journal.pone.0000943

  • Kotz S, Kozubowski TJ, Podgórski K (2001) The Laplace distribution and generalizations: a revisit with applications to communications, economics, engineering, and finance. Birkhäuser, Boston

  • Kubovy M, Schutz M (2010) Audio-visual objects. Rev Phil Psych 1:41–61

  • Laurienti PJ, Burdette JH, Maldjian JA, Wallace MT (2006) Enhanced multisensory integration in older adults. Neurobiol Aging 27:1155–1163

  • Lewkowicz DJ (1996) Perception of auditory-visual temporal synchrony in human infants. J Exp Psychol Hum Percept Perform 22:1094–1106

  • Lewkowicz DJ (2010) Infant perception of audio-visual speech synchrony. Dev Psychol 46(1):66–77

  • Meredith MA (2002) On the neural basis for multisensory convergence: a brief overview. Cogn Brain Res 14:31–40

  • Miller JO (1982) Divided attention: evidence for coactivation with redundant signals. Cogn Psychol 14:247–279

  • Molholm S, Ritter W, Javitt DC, Foxe JJ (2004) Multisensory visual-auditory object recognition in humans: a high-density electrical mapping study. Cereb Cortex 14:452–465

  • Parise C, Spence C (2009) 'When birds of a feather flock together': synesthetic correspondences modulate audiovisual integration in non-synesthetes. PLoS ONE 4(5):e5664

  • Peiffer AM, Mozolic JL, Hugenschmidt CE, Laurienti PJ (2007) Age-related multisensory enhancement in a simple audiovisual detection task. NeuroReport 18(10):1077–1081

  • Poliakoff E, Shore DI, Lowe C, Spence C (2006) Visuotactile temporal order judgments in ageing. Neurosci Lett 396(3):207–211

  • Powers AR III, Hillock AR, Wallace MT (2009) Perceptual training narrows the temporal window of multisensory binding. J Neurosci 29(39):12265–12274

  • Raab DH (1962) Statistical facilitation of simple reaction times. Trans NY Acad Sci 24:574–590

  • Roach NW, Heron J, McGraw PV (2006) Resolving multisensory conflict: a strategy for balancing the costs and benefits of audio-visual integration. Proc Biol Sci 273:2159–2168

  • Roseboom W, Nishida S, Arnold DH (2009) The sliding window of audio-visual simultaneity. J Vis 9(12):4.1–8.1

  • Sanabria D, Soto-Faraco S, Chan JS, Spence C (2005) Intramodal perceptual grouping modulates multisensory integration: evidence from the crossmodal congruency task. Neurosci Lett 377:59–64

  • Sato Y, Toyoizumi T, Aihara K (2007) Bayesian inference explains perception of unity and ventriloquism aftereffect: identification of common sources of audiovisual stimuli. Neural Comput 19(12):3335–3355

  • Setti A, Burke KE, Kenny RA, Newell FN (2011) Is inefficient multisensory processing associated with falls in older people? Exp Brain Res 209(3):375–384

  • Shams L, Kamitani Y, Shimojo S (2000) Illusions. What you see is what you hear. Nature 408(6814):788

  • Shams L, Kamitani Y, Shimojo S (2002) Visual illusion induced by sound. Brain Res Cogn Brain Res 14(1):147–152

  • Shams L, Ma WJ, Beierholm U (2005) Sound-induced flash illusion as an optimal percept. NeuroReport 16:1923–1927

  • Soto-Faraco S, Lyons J, Gazzaniga M, Spence C, Kingstone A (2002) The ventriloquist in motion: illusory capture of dynamic information across sensory modalities. Cogn Brain Res 14:139–146

  • Spence C (2007) Audiovisual multisensory integration. Acoust Sci Technol 28:61–70

  • Stein BE, Meredith MA (1993) The merging of the senses. The MIT Press, Cambridge

  • Van Wanrooij MM, Bell AH, Munoz DP, Van Opstal AJ (2009) The effect of spatial-temporal audiovisual disparities on saccades in a complex scene. Exp Brain Res 198:425–437

  • Van Wanrooij MM, Bremen P, Van Opstal AJ (2010) Acquired prior knowledge modulates audiovisual integration. Eur J Neurosci 31:1763–1771

  • Vroomen J, Keetels M (2010) Perception of intersensory synchrony: a tutorial review. Atten Percept Psychophys 72:871–884

  • Vroomen J, Keetels M, de Gelder B, Bertelson P (2004) Recalibration of temporal order perception by exposure to audio-visual asynchrony. Brain Res Cogn Brain Res 22:32–35

  • Wallace MT, Roberson GE, Hairston WD, Stein BE, Vaughan JW, Schirillo JA (2004) Unifying multisensory signals across time and space. Exp Brain Res 158(2):252–258

  • Whitchurch EA, Takahashi TT (2006) Combined auditory and visual stimuli facilitate head saccades in the barn owl (Tyto alba). J Neurophysiol 96:730–745

  • Wozny D, Beierholm U, Shams L (2010) Probability matching as a computational strategy used in perception. PLoS Comput Biol 6(8):e1000871. doi:10.1371/journal.pcbi.1000871


Acknowledgments

This research is supported by grant SFB/TR31 (Project B4) from Deutsche Forschungsgemeinschaft (DFG) to H.C. and by a grant from Nowetas Foundation to both authors. We are most thankful to Marc Van Wanrooij for providing us with data on the effect of prior probability. Helpful comments from the reviewers are also gratefully acknowledged.

Author information

Corresponding author

Correspondence to Hans Colonius.

Appendices

Appendix 1: probability of integration P(I)

The peripheral processing times V for the visual and A for the auditory stimulus have exponential distributions with parameters λ_V and λ_A, respectively. That is,

$$ \begin{aligned} f_V(t)&=\lambda_V e^{-\lambda_V t} \\ f_A(t)&=\lambda_A e^{-\lambda_A t} \end{aligned} $$

for t ≥ 0, and f_V(t) = f_A(t) ≡ 0 for t < 0. The corresponding distribution functions are denoted F_V(t) and F_A(t).

The visual stimulus is the target and the auditory stimulus is the nontarget. By definition,

$$ \begin{aligned} P(I)& = Pr(A + \tau < V < A + \tau + \omega) \\ & = \int\limits_0^\infty f_A(x)\{F_V(x+\tau + \omega)-F_V(x+\tau)\} dx, \\ \end{aligned} $$

where τ denotes the SOA value and ω is the width of the integration window. Computing the integral requires distinguishing three cases according to the signs of τ and τ + ω; a numerical cross-check of the resulting closed-form expressions is sketched after the case list.

(i) τ < τ + ω < 0:

$$ \begin{aligned} P(I)& = \int\limits_{-\tau - \omega}^{-\tau} \lambda_A e^{-\lambda_A x} \{1-e^{-\lambda_V(x+\tau+\omega)}\} {\hbox {d}}x \\ &\quad+ \int\limits_{-\tau}^\infty \lambda_A e^{-\lambda_A x}\{e^{-\lambda_V(x+\tau)}-e^{-\lambda_V(x+\tau+\omega)}\} {\hbox {d}}x \\ &=\frac{\lambda_V}{\lambda_V+\lambda_A}\, e^{\lambda_A \tau}(-1+e^{\lambda_A \omega}); \end{aligned} $$

(ii) τ < 0 < τ + ω:

$$ \begin{aligned} P(I)&=\int\limits_0^{-\tau} \lambda_A e^{-\lambda_A x} \left\{1-e^{-\lambda_V(x+\tau+\omega)}\right\} {\hbox {d}}x \\ &\quad + \int\limits_{-\tau}^\infty \lambda_A e^{-\lambda_A x}\left\{e^{-\lambda_V(x+\tau)}-e^{-\lambda_V(x+\tau+\omega)}\right\} {\hbox {d}}x \\ &=\frac{1}{\lambda_V+\lambda_A}\, \left\{\lambda_A\left(1-e^{-\lambda_V(\omega+\tau)}\right) +\lambda_V(1-e^{\lambda_A \tau})\right\}; \end{aligned} $$

(iii) 0 < τ < τ + ω:

$$ \begin{aligned} P(I)&=\int\limits_0^\infty \lambda_A e^{-\lambda_A x}\{e^{-\lambda_V(x+\tau)}-e^{-\lambda_V(x+\tau+\omega)}\} {\hbox {d}}x \\ &=\frac{\lambda_A}{\lambda_V+\lambda_A} \{e^{-\lambda_V \tau} -e^{-\lambda_V(\omega + \tau)} \}. \end{aligned} $$
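As a numerical cross-check of these three closed-form expressions (this sketch is not part of the original paper; the rate and SOA values are arbitrary), one can compare the case formulas against a Monte Carlo estimate of Pr(A + τ < V < A + τ + ω) obtained from exponential samples:

```python
import math
import random

def p_integration(lam_V, lam_A, tau, omega):
    """Closed-form P(I) = Pr(A + tau < V < A + tau + omega), following the three cases above."""
    if tau + omega < 0:                       # case (i): tau < tau + omega < 0
        return lam_V / (lam_V + lam_A) * math.exp(lam_A * tau) * (math.exp(lam_A * omega) - 1.0)
    if tau < 0:                               # case (ii): tau < 0 < tau + omega
        return (lam_A * (1.0 - math.exp(-lam_V * (omega + tau)))
                + lam_V * (1.0 - math.exp(lam_A * tau))) / (lam_V + lam_A)
    # case (iii): 0 <= tau < tau + omega
    return lam_A / (lam_V + lam_A) * (math.exp(-lam_V * tau) - math.exp(-lam_V * (tau + omega)))

def p_integration_mc(lam_V, lam_A, tau, omega, n=200_000, seed=1):
    """Monte Carlo estimate of the same probability from exponential V and A."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        V = rng.expovariate(lam_V)
        A = rng.expovariate(lam_A)
        if A + tau < V < A + tau + omega:
            hits += 1
    return hits / n

# Illustrative parameter values on a millisecond scale (assumptions, not fitted values).
lam_V, lam_A = 1 / 60.0, 1 / 40.0
omega = 100.0
for tau in (-150.0, -50.0, 50.0):
    print(tau,
          round(p_integration(lam_V, lam_A, tau, omega), 4),
          round(p_integration_mc(lam_V, lam_A, tau, omega), 4))
```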

Appendix 2: asymmetric-Laplace density

The asymmetric-Laplace density AL(θ, κ, σ) is defined as (cf. Kotz et al. 2001, p. 137)

$$ f(t| \theta, \sigma,\kappa)= \frac{\sqrt{2}}{\sigma} \frac{\kappa}{1+\kappa^2}\left\{\begin{array}{ll}\exp\left( -\frac{\sqrt{2}\kappa}{\sigma}\,|t-\theta| \right)&\hbox{if}\,t \geq \theta,\\ \exp\left(-\frac{\sqrt{2}}{\sigma \kappa}\,|t-\theta| \right)&\hbox{if}\,t<\theta, \end{array}\right. $$

where θ is a location parameter, κ is an asymmetry parameter (κ = 1 implies symmetry), and σ is a scale parameter. In deriving the distributional form used in the text, the following parameter mappings have been made:

$$ \sigma^2=\frac{2}{\lambda_V \lambda_A}; \quad \kappa^2=\lambda_V/\lambda_A;\quad \theta=0. $$
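With this mapping, the asymmetric-Laplace density coincides with the density of the latency difference V − A of two independent exponential variables. The following sketch (illustrative only; the rates lam_V and lam_A are arbitrary and not values from the paper) checks this numerically by comparing the density against a histogram of simulated differences:

```python
import math
import random

def asym_laplace_pdf(t, theta, sigma, kappa):
    """Asymmetric-Laplace density AL(theta, kappa, sigma), as in Kotz et al. (2001)."""
    c = math.sqrt(2.0) / sigma * kappa / (1.0 + kappa ** 2)
    if t >= theta:
        return c * math.exp(-math.sqrt(2.0) * kappa / sigma * abs(t - theta))
    return c * math.exp(-math.sqrt(2.0) / (sigma * kappa) * abs(t - theta))

# Parameter mapping from the text: sigma^2 = 2/(lam_V*lam_A), kappa^2 = lam_V/lam_A, theta = 0.
lam_V, lam_A = 1 / 60.0, 1 / 40.0          # illustrative rates (assumptions)
sigma = math.sqrt(2.0 / (lam_V * lam_A))
kappa = math.sqrt(lam_V / lam_A)

# Compare the AL density with an empirical density of D = V - A for exponential V and A.
rng = random.Random(0)
samples = [rng.expovariate(lam_V) - rng.expovariate(lam_A) for _ in range(200_000)]
bin_width = 10.0
for t in (-100.0, -20.0, 0.0, 20.0, 100.0):
    empirical = sum(t - bin_width / 2 <= d < t + bin_width / 2 for d in samples) / (len(samples) * bin_width)
    print(t, round(asym_laplace_pdf(t, 0.0, sigma, kappa), 5), round(empirical, 5))
```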


Cite this article

Colonius, H., Diederich, A. Computing an optimal time window of audiovisual integration in focused attention tasks: illustrated by studies on effect of age and prior knowledge. Exp Brain Res 212, 327–337 (2011). https://doi.org/10.1007/s00221-011-2732-x
