
The Multisensory Driver: Contributions from the Time-Window-of-Integration Model

Conference paper

Abstract

In this paper, we sketch a specific framework developed to describe and predict crossmodal effects in orienting responses. The time-window-of-integration (TWIN) model postulates that a crossmodal stimulus triggers a race mechanism in the early, peripheral sensory pathways, followed by a compound stage of converging subprocesses that comprises neural integration of the input and preparation of a response. We describe the model in the context of a focused attention task and outline its potential for informing the design of multisensory warning signals.
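To make the two-stage structure concrete, below is a minimal Monte Carlo sketch of the TWIN prediction for a focused attention task with a visual target and an auditory nontarget. It assumes exponential peripheral (first-stage) processing times, and all numerical values (rates, window width, second-stage mean, facilitation amount) are illustrative placeholders rather than estimates reported in the paper: integration occurs only when the nontarget wins the peripheral race and the target's peripheral process terminates within the time window, in which case second-stage processing is shortened by a fixed amount.

```python
import numpy as np

# Minimal Monte Carlo sketch of the TWIN model (focused attention task:
# visual target, auditory nontarget). All parameter values are hypothetical.
rng = np.random.default_rng(0)

n = 100_000                  # simulated trials
scale_v, scale_a = 60.0, 40.0  # mean exponential peripheral processing times (ms)
soa = 0.0                    # onset of auditory nontarget relative to visual target (ms)
omega = 200.0                # width of the time window of integration (ms)
mu2 = 250.0                  # mean second-stage time (integration + response preparation), ms
delta = 40.0                 # second-stage facilitation when integration occurs (ms)

V = rng.exponential(scale_v, n)          # peripheral processing time of the visual target
A = soa + rng.exponential(scale_a, n)    # peripheral processing time of the auditory nontarget

# First stage: race between the peripheral processes. Integration occurs only if
# the nontarget wins the race and the target terminates within the window omega.
integrate = (A < V) & (V <= A + omega)

# Second stage: reaction time is reduced by delta on trials with integration.
RT = V + mu2 - delta * integrate

print(f"P(integration)        = {integrate.mean():.3f}")
print(f"mean RT, bimodal      = {RT.mean():.1f} ms")
print(f"mean RT, visual alone = {V.mean() + mu2:.1f} ms")
```

Under these assumptions, the expected crossmodal facilitation equals the probability of integration multiplied by the second-stage facilitation, which is how window width and stimulus onset asynchrony enter the model's reaction-time predictions.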

Keywords

Multisensory Integration · TWIN model · Focused attention · Reaction time

Acknowledgements

Partial support of this research by DFG SFB/TR 31 (Active listening), by NOWETAS Foundation, and by IMoST (Integrated Modeling for Safe Transportation) is acknowledged.


Copyright information

© Springer-Verlag Italia Srl 2011

Authors and Affiliations

  1. Department of Psychology, Oldenburg University, Oldenburg, Germany
  2. School of Humanities and Social Sciences, Jacobs University Bremen, Bremen, Germany
