Abstract
Information integration theories posit that the integration of information is necessary and/or sufficient for consciousness. In this paper, we focus on three of the most prominent information integration theories: Information Integration Theory (IIT), Global Workspace Theory (GWT), and Attended Intermediate-level Representation theory (AIR). We begin by explicating each theory and the key concepts it employs (e.g., information, integration). We then argue that the current evidence indicates that the integration of information, as specified by each of these theories, is neither necessary nor sufficient for consciousness. Unlike GWT and AIR, IIT maintains that the integration of information is both necessary and sufficient for consciousness. We present empirical evidence indicating that simple features are experienced in the absence of feature integration and argue that it challenges IIT’s necessity claim. In addition, we challenge IIT’s sufficiency claim by presenting evidence from hemineglect cases and amodal completion indicating that contents may be integrated and yet fail to give rise to subjective experience. Moreover, we present empirical evidence from subjects with frontal lesions who are unable to carry out simple instructions (despite appearing to understand their meaning) and argue that this evidence is irreconcilable with GWT. Lastly, we argue that empirical evidence indicating that patients with visual agnosia fail to identify objects they report being conscious of presents a challenge to AIR’s necessity claim.


[Figure adapted from Treisman (2006)]
[Figure adapted from Treisman (2006)]
[Figure adapted from Neri and Levi (2006)]
[Figure adapted from Turnbull et al. (2004)]



Notes
A theory of consciousness should be able to explain the neural substrates of mental states and their behavioral manifestations.
Bayne (2018) argues that the phenomenological axioms posited by IIT fail to capture the essential features of every experience.
IIT’s postulates identify an experience with the set of all mechanisms (i.e., the “conceptual structure”) and the maximally irreducible probability distribution of potential past and future states of a system as informed by a mechanism in its current state (i.e., its “cause–effect repertoire”; see Tononi and Koch 2015).
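To give a rough feel for what “maximally irreducible” amounts to, consider a deliberately simplified toy computation (our own illustrative sketch in Python, not IIT 3.0’s actual Φ algorithm): a two-node system whose transition structure cannot be recovered by modelling each node separately.

```python
import itertools
import numpy as np

# Toy two-node system: each binary node copies the *other* node's previous
# state. A deliberately simplified illustration of irreducibility, not IIT
# 3.0's actual Phi algorithm.
states = list(itertools.product([0, 1], repeat=2))   # (A, B)

def update(s):
    return (s[1], s[0])

# True transition matrix of the whole system (deterministic dynamics).
tpm = np.zeros((4, 4))
for i, s in enumerate(states):
    tpm[i, states.index(update(s))] = 1.0

# Each node modelled in isolation: P(node_next | node_own_past), averaging
# over the unobserved other node. Both come out completely flat.
def node_tpm(node):
    out = np.zeros((2, 2))
    for s in states:
        out[s[node], update(s)[node]] += 0.5   # uniform prior over the other node
    return out

tpm_a, tpm_b = node_tpm(0), node_tpm(1)        # both equal [[0.5, 0.5], [0.5, 0.5]]

# Best reconstruction of the whole from the parts: product of the node models.
recon = np.array([[tpm_a[s[0], t[0]] * tpm_b[s[1], t[1]]
                   for t in states] for s in states])

print(np.allclose(tpm, recon))     # False: the whole is not reducible to its parts
print(np.abs(tpm - recon).max())   # 0.75: what the partitioned model gets wrong
```

Actual Φ involves much more than this (a search over partitions, cause and effect repertoires, and so on); the toy only illustrates the sense in which a whole system can carry structure that disappears when it is modelled as a collection of independent parts.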
But why, one might ask, is there more information contained in the representation of a blue square at some location L1, than there is in the representation of a color at L1 and a shape at L1? Why wouldn’t the first representation be reducible to its parts? The reason for this seems to be that the former representation carries the information that the color and the shape are properties of a single object (and thus their fates are non-accidentally correlated), whereas the latter representation lacks this information.
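To make this concrete, here is a small worked example (a hypothetical sketch of our own, in Python): when colour and shape are represented as bound to a single object, their joint distribution carries information about their correlation that the product of the two separate (marginal) representations lacks.

```python
import numpy as np

def mutual_information(joint):
    """Mutual information (in bits) between the row and column variables."""
    px = joint.sum(axis=1, keepdims=True)   # marginal over colour
    py = joint.sum(axis=0, keepdims=True)   # marginal over shape
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px * py)[nz])).sum())

# Rows: colour (blue, red); columns: shape (square, circle).
# Bound representation: colour and shape are properties of one object, so
# their "fates" are correlated (here, blue always goes with square).
bound = np.array([[0.5, 0.0],
                  [0.0, 0.5]])

# Unbound representation: a colour at L1 and a shape at L1, each represented
# independently, so the joint is just the product of the marginals.
unbound = np.array([[0.25, 0.25],
                    [0.25, 0.25]])

print(mutual_information(bound))    # 1.0 bit: the correlation is extra information
print(mutual_information(unbound))  # 0.0 bits: nothing over and above the parts
```

On this way of putting it, the bound representation is “more than the sum of its parts” precisely because the correlation between colour and shape is itself a piece of information that no representation of either feature alone can carry.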
IIT’s contention that the experience of a blue square has its integrated content essentially is closely tied to the unity of consciousness: conscious unity can be understood in terms of the irreducibility of its components (see Bayne and Chalmers 2003).
For arguments against this and other claims pertaining to the axioms see Bayne (2018).
One might wonder how IIT would account for experiences in different sensory modalities that pick out the same feature of a given object. Consider, for example, the case of seeing and holding a ball. The visual and tactile experiences of the ball’s spherical shape have different phenomenal characters. How would IIT account for this difference in phenomenology, given that both experiences pick out a spherical shape? One way for IIT to accommodate this sort of phenomenal difference is to take the shared feature to be integrated differently in the different sensory modalities or submodalities. For example, the sphericality of the ball may be taken to be integrated in a visual way in one case but in a tactile way in the other (for a similar solution to the general problem, sometimes referred to as ‘Molyneux’s question’, see Chalmers 2004).
The global workspace cannot be associated with a fixed set of brain areas because various cortical areas can contain workspace neurons with suitable long-distance and widespread connectivity needed to give rise to conscious experiences. However, the fact that workspace neurons seem to be denser in certain areas such as the prefrontal cortices (PFC) and the anterior cingulate (AC) suggests that these areas play a dominant role in the function of the global workspace.
Block (1995, p. 234) argues that there is a natural use of ‘consciousness’ and ‘awareness’ corresponding to ‘access consciousness’ and ‘phenomenal consciousness’, respectively. According to Block, phenomenal consciousness can be understood as awareness, whereas access consciousness is better understood as consciousness proper. When a content is both P- and A-conscious, Block suggests that we speak of ‘conscious awareness’.
Many of those who agree that there is a meaningful distinction to be drawn between phenomenal and access consciousness argue that the two do not actually come apart. David Chalmers (1997), for example, argues that while phenomenal consciousness and access consciousness, as defined by Block (1995), coincide (i.e., are both present or both absent in the actual world), it is conceivable and, therefore, possible that they come apart. His view supports those like Baars and Prinz who take P-consciousness and A-consciousness to coincide.
In the case of vision, the intermediate level is anatomically located in a family of areas involved in processing color (hue), motion and three-dimensional shape (extra-striate brain regions). By comparison, the lower level is anatomically located in primary visual cortex (V1) and some subcortical structures such as the visual nuclei of the thalamus and the superior colliculus; the high level recruits structures in inferior temporal areas (such as TE, TEO, and sections of the superior temporal sulcus), the lateral occipital complex, and some structures in parietal cortex (such as the ventral and posterior inter-parietal areas).
Note that, on this view, high-level representations are used to mediate encoding once an attended item has been selected for use in cognitive tasks or for retention in long-term memory; but, Prinz argues, high-level representations are not themselves modulated by attention and therefore do not themselves reach conscious awareness (Prinz 2012).
We are grateful to an anonymous reviewer for suggesting these objections.
As an anonymous reviewer noted, proponents of IIT could insist that these subjective reports should not be taken at face value because Φ is maximal at posterior parts of the brain. Since the frontal activity associated with reportability is not part of the neural correlates of consciousness, reportability is suspect as a guide to the phenomenology of experience. However, this objection seems misguided. Our claim is not that reportability is part of conscious experience but rather that conscious experience is, in the good cases, accurately reportable. If IIT rejects this claim, then it is unclear what sort of evidence it can provide (apart from reportability) for the claim that its axioms govern the phenomenology of experience. If proponents of IIT claim instead that the Treisman cases are not “good” for whatever reason, but that report does generally serve as a guide to phenomenology (especially when it comes to their axioms), then the latter claim, given the dissociation between frontal activity and consciousness, starts to seem exceedingly mysterious.
While objective measures typically involve asking subjects to make forced-choice guesses about what they have seen, subjective measures typically involve reportability (Szczepanowski and Pessoa 2007; Kunimoto et al. 2001). When subjects are asked to report on whether they saw a stimulus, negative responses are taken as evidence that the stimulus was not experienced consciously.
See the special issue of Consciousness and Cognition, March 2015, volume 32, edited by Robert Foley and Bob Kentridge.
We are grateful to an anonymous reviewer for raising this objection.
References
Baars, B. J. (1988). A cognitive theory of consciousness. Cambridge: Cambridge University Press.
Baars, B. J. (1997). In the theater of consciousness. New York, NY: Oxford University Press.
Baars, B. J. (2002). The conscious access hypothesis: Origins and recent evidence. Trends in Cognitive Sciences, 6(1), 47–52.
Baars, B. J. (2005). Global workspace theory of consciousness: Toward a cognitive neuroscience of human experience. Progress in Brain Research, 150, 45–53.
Baars, B. J., & Franklin, S. (2003). How conscious experience and working memory interact. Trends in Cognitive Sciences, 7, 166–172.
Baars, B. J., & Franklin, S. (2009). Consciousness is computational: The LIDA model of global workspace theory. International Journal of Machine Consciousness, 1, 23–32.
Baddeley, A., & Wilson, B. (1988). Frontal amnesia and the dysexecutive syndrome. Brain and Cognition, 7(2), 212–230.
Ballard, D. H., Hayhoe, M. M., & Pelz, J. B. (1994). Memory representations in natural tasks. Journal of Cognitive Neuroscience, 7, 66–80.
Bayne, T. (2010). The unity of consciousness. Oxford: Oxford University Press.
Bayne, T. (2018). On the axiomatic foundations of the integrated information theory of consciousness. Neuroscience of Consciousness. https://doi.org/10.1093/nc/niy007.
Bayne, T., & Chalmers, D. J. (2003). The unity of consciousness. In A. Cleeremans (Ed.), The unity of consciousness: Binding, integration, dissociation. New York: Oxford University Press.
Bisiach, E., & Rusconi, M. L. (1990). Break-down of perceptual awareness in unilateral neglect. Cortex, 26, 643–649.
Blackmore, S. (2002). There is no stream of consciousness. Journal of Consciousness Studies, 9(5–6), 17–28.
Block, N. (1995). On a confusion about a function of consciousness. Behavioral and Brain Sciences, 18, 227–287.
Block, N. (1997). On a confusion about a function of consciousness. In N. Block, O. Flanagan, & G. Güzeldere (Eds.), The nature of consciousness: Philosophical debates (pp. 375–415). Cambridge, MA: MIT Press.
Brogaard, B. (2011a). Are there unconscious perceptual processes? Consciousness and Cognition, 20, 449–463.
Brogaard, B. (2011b). Conscious vision for action vs. unconscious vision for action. Cognitive Science, 35, 1076–1104.
Brogaard, B. (2015). Type 2 blindsight and the nature of visual experience. Consciousness and Cognition, 32, 92–103.
Brogaard, B., & Gatzia, D. E. (2017). Unconscious imagination and the mental imagery debate. Frontiers in Psychology, 8, 799. https://doi.org/10.3389/fpsyg.2017.00799.
Brüntrup, G., & Jaskolla, L. (Eds.). (2016). Panpsychism. New York: Oxford University Press.
Carruthers, P. (1996). Language, thought and consciousness. Cambridge: Cambridge University Press.
Carruthers, P. (2000). Phenomenal consciousness. Cambridge: Cambridge University Press.
Chalmers, D. J. (1997). Availability: The cognitive basis of experience? Behavioral and Brain Sciences, 20, 148–149.
Chalmers, D. J. (1998). The problems of consciousness. In H. Jasper, L. Descarries, V. Castellucci, & S. Rossignol (Eds.), Advances in neurology: Consciousness: At the frontiers of neuroscience. New York: Lippincott-Raven.
Chalmers, D. J. (2004). The representational character of experience. In B. Leiter (Ed.), The future for philosophy. New York: Oxford University Press.
Chalmers, D. J. (2015). Panpsychism and panprotopsychism. In T. Alter & Y. Nagasawa (Eds.), Consciousness in the physical world: Perspectives on Russellian Monism (pp. 246–276). New York: Oxford University Press.
Cohen, A., & Ivry, R. (1989). Illusory conjunctions inside and outside the focus of attention. Journal of Experimental Psychology: Human Perception and Performance, 15(4), 650.
Dehaene, S., & Changeux, J. P. (2011). Experimental and theoretical approaches to conscious processing. Neuron, 70, 200–227.
Dehaene, S., Changeux, J. P., Naccache, L., Sackur, J., & Sergent, C. (2006). Conscious, preconscious, and subliminal processing: A testable taxonomy. Trends in Cognitive Sciences, 10(5), 204–211.
Dehaene, S., & Naccache, L. (2001). Towards a cognitive neuroscience of consciousness: Basic evidence and a workspace framework. Cognition, 79, 1–37.
Dretske, F. (1993). Conscious experience. Mind, 102, 263–283.
Driver, J., & Vuilleumier, P. (2001). Perceptual awareness and its loss in unilateral neglect and extinction. Cognition, 79(1), 39–88.
Franklin, S., & Graesser, A. C. (1997). Is it an agent, or just a program? A taxonomy for autonomous agents. In Intelligent agents III. Berlin: Springer.
Goodale, M. A., & Milner, A. D. (1992). Separate visual pathways for perception and action. Trends in Neurosciences, 15, 20–25.
Goodale, M. A., Milner, A. D., Jakobson, L. S., & Carey, D. P. (1991). A neurological dissociation between perceiving objects and grasping them. Nature, 349, 154–156.
Hardcastle, V. G. (1994). Psychology's ‘binding problem’ and possible neurobiological solutions. Journal of Consciousness Studies, 1(1), 66–90.
Hu, Y., & Goodale, M. A. (2000). Grasping after a delay shifts size-scaling from absolute to relative metrics. Journal of Cognitive Neuroscience, 12, 856–868.
Hummel, J. E. (2001). Complementary solutions to the binding problem in vision: Implications for shape perception and object recognition. Visual Cognition, 8(3–5), 489–517.
Kanne, S. (2002). The role of semantic, orthographic, and phonological prime information in unilateral visual neglect. Cognitive Neuropsychology, 19(3), 245–261.
Konow, A., & Pribram, K. H. (1970). Error recognition and utilization produced by injury to the frontal cortex in man. Neuropsychologia, 8, 489–491.
Köhler, S., & Moscovitch, M. (1997). Unconscious visual processing in neuropsychological syndromes: A survey of the literature and evaluation of models of consciousness. In M. D. Rugg (Ed.), Cognitive neuroscience (pp. 305–373). Cambridge, MA: The MIT Press.
Kunimoto, C., Miller, J., & Pashler, H. (2001). Confidence and accuracy of near-threshold discrimination responses. Consciousness and Cognition, 10, 294–340.
Lycan, W. (1996). Consciousness and experience. Cambridge, MA.: MIT Press.
Macpherson, F. (2015). The structure of experience, the nature of the visual, and type 2 blindsight. Consciousness and Cognition, 32, 104–128.
Manson, N. (2000). State consciousness and creature consciousness: A real distinction. Philosophical Psychology, 13(3), 405–410.
Marr, D. (1982). Vision. San Francisco: Freeman.
Marshall, J. C., & Halligan, P. W. (1988). Blindsight and insight in visuospatial neglect. Nature, 336, 766–767.
McGlinchey-Berroth, R., Milberg, W. P., Verfaellie, M., Alexander, M., & Kilduff, P. (1993). Semantic priming in the neglected field: Evidence from a lexical decision task. Cognitive Neuropsychology, 10, 79–108.
Michotte, A., Thinès, G., Costall, A., & Butterworth, G. (1991). Michotte's experimental phenomenology of perception. Hillsdale: L. Erlbaum Associates.
Milner, A., & Goodale, M. (1995). The visual brain in action. Oxford: Oxford University Press.
Milner, A. D., Perrett, D. I., Johnston, R. S., Benson, P. J., Jordan, T. R., Heeley, D. W., et al. (1991). Perception and action in 'visual form agnosia'. Brain, 114(Pt 1B), 405–428.
Neisser, U. (1967). Cognitive psychology. New York, NY: Appleton-Century-Crofts.
Neri, P., & Levi, D. M. (2006). Spatial resolution for feature binding is impaired in peripheral and amblyopic vision. Journal of Neurophysiology, 96(1), 142–153.
Oizumi, M., Albantakis, L., & Tononi, G. (2014). From the phenomenology to the mechanisms of consciousness: Integrated information theory 3.0. PLoS Computational Biology, 10(5), 1–25.
Pins, D., & Ffytche, D. (2003). The neural correlates of conscious vision. Cerebral Cortex, 13(5), 461–474.
Posner, M. I. (1994). Attention: The mechanisms of consciousness. Proceedings of the National Academy of Sciences USA, 91, 7398–7403.
Posner, M. I., & Dehaene, S. (1994). Attentional networks. Trends in Neuroscience, 17, 75–79.
Prinz, J. (2000). A neurofunctional theory of visual consciousness. Consciousness and Cognition, 9(2), 243–259.
Prinz, J. (2012). The conscious brain. Oxford: Oxford University Press.
Raffone, A., & Pantani, M. (2010). A global workspace model for phenomenal and access consciousness. Consciousness and Cognition, 19, 580–596.
Robertson, L. C. (2003). Binding, spatial attention and perceptual awareness. Nature Reviews Neuroscience, 4(2), 93.
Robertson, L., Treisman, A., Friedman-Hill, S., & Grabowecky, M. (1997). The interaction of spatial and object pathways: Evidence from Balint's syndrome. Journal of Cognitive Neuroscience, 9(3), 295–317.
Rosenthal, D. M. (1997). A theory of consciousness. In N. Block, O. Flanagan, & G. Güzeldere (Eds.), The nature of consciousness: Philosophical debates. Cambridge, MA: MIT Press.
Schacter, D. L., Buckner, R. L., & Koutstaal, W. (1998). Memory, consciousness and neuroimaging. Philosophical Transactions of the Royal Society of London B, 353, 1861–1878.
Shea, N., & Frith, C. D. (2016). Dual-process theories and consciousness: The case for ‘Type Zero’ cognition. Neuroscience of Consciousness, 1–10. https://www.philosophy.ox.ac.uk/sites/default/files/philosophy/documents/media/shea_frith_type_0_cogn_nconsc16_oa.pdf.
Simons, D. J., & Levin, D. T. (1998). Failure to detect changes to people during real-world interaction. Psychonomic Bulletin and Review, 5(4), 644–649.
Sligte, I. G., Scholte, H. S., & Lamme, V. A. F. (2008). Are there multiple visual short-term memory stores? PLoS One, 3(e1699), 1–9.
Strawson, G. (2006). Realistic materialism: Why physicalism entails panpsychism. Journal of Consciousness Studies, 13(10–11), 3–31.
Szczepanowski, R., & Pessoa, L. (2007). Fear perception: Can objective and subjective awareness measures be dissociated? Journal of Vision, 7, 1–17.
Tononi, G. (2004). An information integration theory of consciousness. BMC Neuroscience, 5(1), 42.
Tononi, G. (2008). Consciousness as integrated information: A provisional manifesto. Biological Bulletin, 215, 216–242.
Tononi, G. (2010). Information integration: Its relevance to brain function and consciousness. Archives Italiennes de Biologie, 148, 299–322.
Tononi, G. (2012). Integrated information theory of consciousness: An updated account. Archives Italiennes de Biologie, 150, 290–326.
Tononi, G., & Koch, C. (2015). Consciousness: Here, there and everywhere? Philosophical Transactions of the Royal Society B, 370(1668), 20140167.
Treisman, A. (1996). The binding problem. Current Opinion in Neurobiology, 6(2), 171–178.
Treisman, A. (2006). How the deployment of attention determines what we see. Visual Cognition, 14(4–8), 411–443.
Turnbull, O. H., Driver, J., & McCarthy, R. A. (2004). 2D but not 3D: Pictorial-depth deficits in a case of visual agnosia. Cortex, 40, 723–738.
Vogel, E. K., Woodman, G. F., & Luck, S. J. (2001). Storage of features, conjunctions, and objects in visual working memory. Journal of Experimental Psychology: Human Perception and Performance, 27, 92–114.
Vuilleumier, P., & Landis, T. (1998). Illusory contours and spatial neglect. NeuroReport, 9(11), 2481–2484.
Vuilleumier, P., Armony, J. L., Clarke, K., Husain, M., Driver, J., & Dolan, R. J. (2002). Neural response to emotional faces with and without awareness: Event-related fMRI in a parietal patient with visual extinction and spatial neglect. Neuropsychologia, 40(12), 2156–2166.
Westwood, D. A., & Goodale, M. A. (2003). Perceptual illusion and the real-time control of action. Spatial Vision, 16, 243–254.
Cite this article
Brogaard, B., Chomanski, B. & Gatzia, D.E. Consciousness and information integration. Synthese 198 (Suppl 3), 763–792 (2021). https://doi.org/10.1007/s11229-020-02613-3
Keywords
- Amodal completion
- Attended intermediate-level representation theory
- Attention
- Consciousness
- Feature integration
- Global workspace theory
- Illusory contours
- Information integration theory
- Spatial neglect
- Visual agnosia