Attention, Perception, & Psychophysics, Volume 72, Issue 1, pp 142–152

Temporal dynamics of unimodal and multimodal feature binding

Research Articles


In two experiments, we studied the temporal dynamics of feature integration with auditory (Experiment 1) and audiovisual (Experiment 2) stimuli and manual responses. Consistent with previous observations, performance was better when the second of two consecutive stimuli shared all or none of the features of the first than when only one feature overlapped. Comparable partial-overlap costs were obtained for combinations of stimulus features and responses. These effects decreased systematically with increasing time between the two stimulus-and-response events, and the rate of decrease was comparable for unimodal and multimodal bindings. Overall effect size reflected the degree of task relevance of the dimension or modality of the respective feature, but the effects of relevance and temporal delay did not interact. This suggests that the processing of stimuli on task-relevant sensory modalities and feature dimensions is facilitated by task-specific attentional sets, whereas the temporal dynamics might reflect that bindings "decay" or become more difficult to access over time.



Copyright information

© Psychonomic Society, Inc. 2010

Authors and Affiliations

  1. Cognitive Psychology Unit, Leiden University, Leiden, The Netherlands
  2. Leiden Institute for Brain and Cognition, Leiden, The Netherlands
