Abstract
There is a view of consciousness that has strong intuitive appeal and empirical support: the intermediate-level theory of consciousness, proposed mainly by Ray Jackendoff and by Jesse Prinz. This theory identifies, as the basis of human phenomenal consciousness, a specific “intermediate” level of representation that sits between high-level non-perspectival thought processes and low-level disjointed feature-detection processes in the perceptual and cognitive processing hierarchy. In this article, we show that the claim that consciousness arises at an intermediate level is true of some cognitive systems, but only in virtue of specific constraints on their active interactions with the environment. We provide ecological reasons why certain processing levels in a cognitive hierarchy are privileged with respect to consciousness. We do this from the perspective of a prediction-error minimization model of perception and cognition, relying especially on the notion of active inference: the privileged level for consciousness depends on the specific dispositions of an organism concerned with inferring its policies for action. Such a level is indeed intermediate for humans, but this depends on the spatiotemporal resolution of the typical actions that a human organism can normally perform. Thus, intermediateness is not an essential feature of consciousness. In organisms with different action dispositions, the privileged level or levels may differ as well.
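To fix ideas, here is a minimal, purely illustrative sketch in Python of the selection principle the abstract describes: treating the “privileged” level as the one whose spatiotemporal resolution best matches the organism’s typical action repertoire. The level names, scales, and action values are hypothetical choices of ours for illustration, not parameters drawn from the article or the literature.

```python
import math

# Toy sketch (not the authors' formal model): pick the hierarchy level whose
# spatiotemporal resolution best matches an organism's typical action repertoire.
# All level names, scales, and action values below are hypothetical.

levels = {
    "low":          {"time_s": 0.05, "space_m": 0.001},  # fast, fine-grained feature hypotheses
    "intermediate": {"time_s": 0.5,  "space_m": 0.1},    # object- and surface-level hypotheses
    "high":         {"time_s": 30.0, "space_m": 10.0},   # abstract, slowly changing hypotheses
}

# Hypothetical typical human actions: grasping, reaching, stepping.
typical_actions = [
    {"duration_s": 0.4, "extent_m": 0.08},
    {"duration_s": 0.6, "extent_m": 0.3},
    {"duration_s": 1.0, "extent_m": 0.7},
]

def mismatch(level, action):
    """Log-scale mismatch between a level's resolution and an action's spatiotemporal scale."""
    dt = abs(math.log(level["time_s"] / action["duration_s"]))
    dx = abs(math.log(level["space_m"] / action["extent_m"]))
    return dt + dx

def privileged_level(levels, actions):
    """Return the level whose resolution best fits the typical action repertoire."""
    totals = {name: sum(mismatch(spec, a) for a in actions) for name, spec in levels.items()}
    return min(totals, key=totals.get)

print(privileged_level(levels, typical_actions))  # prints "intermediate" for these values
```

Swapping in a repertoire of much slower or much faster typical actions shifts which level wins, which is the sense in which, on the view defended here, intermediateness is contingent on action dispositions rather than essential to consciousness.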
Notes
We use ‘consciousness’, ‘phenomenal consciousness’, and ‘phenomenal experience’ interchangeably to refer to the ‘what it’s like’ of seeing, hearing, touching, tasting, etc. (Nagel, 1974). Also, for reasons of length and focus, we wish to remain neutral on the distinction between phenomenal consciousness and access consciousness. If one accepts such a distinction, much of what we say in this article is directly relevant to access consciousness.
Here the term “cognitive” is used in a broad sense, so as to encompass perceptual, affective, and evaluative processes.
It should be noted, however, that Prinz (2012) engages with the mechanistic and functional questions as well, offering a detailed outline of what he takes to be the core neural mechanism realizing conscious experience, namely synchronization of some neural representations at the gamma frequency of 40 Hz. We do not engage with these further questions in this article.
Of course, the contents in the example are token contents, but the fact that they differ in their rate of change shows that, for Marr, they belong to different content types.
We leave the nature of this relation unspecified, since we are not interested in the metaphysical question, as we clarify below.
PEM has been successfully applied to model and explain various phenomena in different perceptual and cognitive domains. Primary examples are binocular rivalry (Hohwy et al., 2008) and the positive symptoms of schizophrenia (Fletcher and Frith, 2008). For an extensive review and discussion of successful applications of PEM, see Clark (2015).
To better illustrate this point, one may rely on Egan’s distinction between the mathematical and the cognitive content of a representational vehicle (Egan, 2013). The mathematical contents, which are the only real contents, are of the same type at each level of the PEM internal model’s hierarchy, while the intentional contents of hypotheses, such as objects and their properties (e.g. relative and general location in the example above), are part of the explanatory gloss we attach to the theory in order to relate the underlying computations to our explanatory goal, which in our case is understanding the scope of phenomenal consciousness. Different levels of the hierarchy may be ascribed different intentional contents in the gloss, depending on the spatiotemporal resolution of the hypotheses at those levels. See also Wiese (2016) for a non-instrumentalist take on Egan’s cognitive contents in PP.
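As a purely illustrative sketch (ours, not drawn from Egan 2013 or the predictive-processing literature), the following Python toy makes the point concrete: every level runs the same kind of mathematical update on prediction errors, while the intentional gloss attached to a level is a label supplied by the theorist. The level names, learning rates, and toy input are assumptions for illustration only.

```python
import math
import random

# Minimal sketch: each level of a toy PEM hierarchy performs the same mathematical
# update (reduce prediction error), while the "gloss" is just a descriptive label.
# Level names, learning rates, and the toy sensory signal are hypothetical.

class Level:
    def __init__(self, gloss, learning_rate):
        self.gloss = gloss          # explanatory gloss: what the level is said to be "about"
        self.lr = learning_rate     # slower levels track slower, coarser regularities
        self.estimate = 0.0         # the level's current hypothesis (just a number here)

    def update(self, evidence):
        """The same mathematical step at every level: reduce prediction error."""
        prediction_error = evidence - self.estimate
        self.estimate += self.lr * prediction_error
        return self.estimate        # passed upward as evidence for the level above

# Hypothetical three-level hierarchy with decreasing learning rates
hierarchy = [
    Level("fine-grained features (fast)", learning_rate=0.9),
    Level("objects and their locations (intermediate)", learning_rate=0.3),
    Level("abstract, slowly changing states (slow)", learning_rate=0.05),
]

random.seed(0)
for t in range(200):
    sensory_input = math.sin(0.2 * t) + 0.1 * random.gauss(0.0, 1.0)  # toy sensory signal
    evidence = sensory_input
    for level in hierarchy:
        evidence = level.update(evidence)

for level in hierarchy:
    print(f"{level.gloss}: current estimate = {level.estimate:.3f}")
```

The update rule is identical across levels; only the gloss and the timescale at which each level’s estimate changes differ, which is all that the ascription of different intentional contents to different levels requires.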
This point is neatly summarized by Clark (2015, p. 293), quoting Lauwereyns on the contents of generative models: “The representations thus constructed [from the generative model] are ‘not actual re-presentations or duplicates of objects in the world but … incomplete, abstract code that makes predictions about the world and revises its predictions on the basis of interaction with the world’ (Lauwereyns, 2012, p. 74)”.
Here we shall take into account only typically developed organisms.
We are grateful to an anonymous referee for pressing this issue.
Gregory (1996, p. 377)
We are grateful to an anonymous referee for pointing us to this line of research.
We are grateful to an anonymous referee for raising this worry.
We are grateful to an anonymous referee for raising this issue.
References
Beets, I., Hart, B., Rösler, F., Henriques, D., Einhäuser, W., & Fiehler, K. (2010). Online action-to-perception transfer: Only percept-dependent action affects perception. Vision Research, 50, 2633–2641.
Brenner, E., & Smeets, J. (1996). Size illusion influences how we lift but not how we grasp an object. Experimental Brain Research, 111, 473–476.
Brouwer, A., Georgiou, I., Glover, S., & Castiello, U. (2006). Adjusting reach to lift movements to sudden visible changes in target’s weight. Experimental Brain Research, 173, 629–636.
Brownstein, M. (2018). The implicit mind. Oxford: Oxford University Press.
Clark, A. (2015). Surfing uncertainty. Oxford: Oxford University Press.
Crick, F., & Koch, C. (2000). The unconscious homunculus. Neuropsychoanalysis, 2(1), 3–11.
Dolega, K. (2017). Moderate predictive processing. In T. Metzinger & W. Wiese (Eds.), Philosophy and predictive processing: 10. Frankfurt am Main: MIND Group.
Downey, A. (2018). Predictive processing and the representation wars: A victory for the eliminativist (via fictionalism). Synthese, 195(12), 5115–5139.
Egan, F. (2013). How to think about mental content. Philosophical Studies, 170(1), 115–135.
Fletcher, P., & Frith, C. (2008). Perceiving is believing: A Bayesian approach to explaining the positive symptoms of schizophrenia. Nature Reviews Neuroscience, 10(1), 48–58.
Friston, K. (2010). The free-energy principle: A unified brain theory? Nature Reviews Neuroscience, 11, 127–138.
Friston, K., Daunizeau, J., & Kiebel, S. (2009). Reinforcement learning or active inference? PLoS ONE, 4(7), e6421.
Friston, K., Rigoli, F., Ognibene, D., Mathys, C., Fitzgerald, T., & Pezzulo, G. (2015). Active inference and epistemic value. Cognitive Neuroscience, 6(4), 187–214.
Friston, K., Samothrakis, S., & Montague, R. (2012). Active inference and agency: Optimal control without cost functions. Biological Cybernetics, 106(8–9), 523–541.
Friston, K., Schwartenbeck, P., Fitzgerald, T., Moutoussis, M., Behrens, T., & Dolan, R. (2013). The anatomy of choice: Active inference and agency. Frontiers in Human Neuroscience, 7, 598.
Frith, C. D. (1995). Consciousness is for other people. Behavioral and Brain Sciences, 18(4), 682–683.
Godfrey-Smith, P. (2016). Other minds. New York: Farrar, Straus and Giroux.
Gregory, R. (1996). What do qualia do? Perception, 25, 377–379.
Hayhoe, M., Shrivastava, A., Mruczek, R., & Pelz, J. (2003). Visual memory and motor planning in a natural task. Journal of Vision, 3(1), 49–63.
Hohwy, J. (2013). The predictive mind. Oxford: Oxford University Press.
Hohwy, J. (2014). The self-evidencing brain. Noûs, 50(2), 259–285.
Hohwy, J. (2016a). Priors in perception: Top-down modulation, Bayesian perceptual learning rate, and prediction error minimization. Consciousness and Cognition, 47, 1–11.
Hohwy, J. (2016b). Prediction, agency, and body ownership. Where is the action? In A. Engel, K. Friston, & D. Kragic (Eds.), The pragmatic turn in cognitive science. Cambridge: MIT Press.
Hohwy, J., Roepstorff, A., & Friston, K. (2008). Predictive coding explains binocular rivalry: An epistemological review. Cognition, 108(3), 687–701.
Hornsby, J. (2013). Basic activity. Aristotelian Society Supplementary Volume, 87(1), 1–18.
Humphreys, G., & Riddoch, M. (2007). How to define an object: Evidence from the effects of action on perception and attention. Mind and Language, 22, 534–547.
Jackendoff, R. (1987). Consciousness and the computational mind. London: Bradford Books.
Kentridge, R., Heywood, C., & Weiskrantz, L. (1999). Attention without awareness in blindsight. Proceedings of the Royal Society B: Biological Sciences, 266(1430), 1805–1811.
Kentridge, R., Nijboer, T. C., & Heywood, C. (2008). Attended but unseen: Visual attention is not sufficient for visual awareness. Neuropsychologia, 46(3), 864–869.
Koch, C. (2004). The quest for consciousness. Englewood: Roberts Publishers.
Lauwereyns, J. (2012). Brain and the gaze: On the active boundaries of vision. Cambridge: MIT Press.
Lavin, D. (2013). Must there be basic action? Noûs, 47(2), 273–301.
Libet, B. (1985). Unconscious cerebral initiative and the role of conscious will in voluntary action. Behavioral and Brain Sciences, 8, 529–566.
Macpherson, F. (2011). Introduction. In K. Hawley & F. Macpherson (Eds.), The admissible contents of experience (pp. 1–15). Hoboken: Wiley.
Mandelbaum, E. (2014). Thinking is believing. Inquiry: An Interdisciplinary Journal of Philosophy, 57(1), 55–96.
Mandelbaum, E. (2015). Attitude, inference, association: On the propositional structure of implicit bias. Noûs, 50(3), 629–658.
Marchi, F., & Newen, A. (2016). The cognitive foundations of visual consciousness: Why should we favour a processing approach? Phenomenology and the Cognitive Sciences, 15(2), 247–264.
Marr, D. (1982). Vision. Cambridge: MIT Press.
Nagel, T. (1974). What is it like to be a bat? The Philosophical Review, 83(4), 435–450.
Newen, A., & Vetter, P. (2017). Why cognitive penetration of our perceptual experience is still the most plausible account. Consciousness and Cognition, 47, 26–37.
Parr, T., Corcoran, A., Friston, K., & Hohwy, J. (2019). Perceptual awareness and active inference. Neuroscience of Consciousness, 5(1), niz012.
Perrykkad, K., & Hohwy, J. (2020). Fidgeting as self-evidencing: A predictive processing account of non-goal-directed action. New Ideas in Psychology, 56, 100750.
Prinz, J. (2000). A neurofunctional theory of consciousness. Consciousness and Cognition, 9, 243–259.
Prinz, J. (2012). The conscious brain: How attention engenders experience. New York: Oxford University Press.
Prinz, J. (2017). The intermediate level theory of consciousness. In S. Schneider & M. Velmans (Eds.), The Blackwell companion to consciousness (2nd ed., pp. 257–271). Wiley.
Raftopoulos, A. (2013). The cognitive impenetrability of the content of early vision is a necessary and sufficient condition for purely nonconceptual content. Philosophical Psychology, 27(5), 601–620.
Riddoch, M., Humphreys, G., Hickman, M., Clift, J., Daly, A., & Colin, J. (2006). I can see what you are doing: Action familiarity and affordance promote recovery from extinction. Cognitive Neuropsychology, 23, 583–605.
Sandis, C. (2010). Basic actions and individuation. In T. O’Connor & C. Sandis (Eds.), A companion to the philosophy of action (pp. 10–17). Oxford: Wiley.
Siegel, S. (2006a). Subject and object in the contents of visual experience. The Philosophical Review, 115, 355–388.
Siegel, S. (2006b). Which properties are represented in perception? In T. Szabo Gendler & J. Hawthorne (Eds.), Perceptual experience. New York: Oxford University Press.
Siegel, S. (2012). The contents of visual experience. Oxford: Oxford University Press.
Song, C., & Yao, H. (2016). Unconscious processing of invisible visual stimuli. Scientific Reports, 6, 38917.
Vetter, P., & Newen, A. (2014). Varieties of cognitive penetration in visual perception. Consciousness and Cognition, 27, 62–75.
Wiese, W. (2016). What are the contents of representations in predictive processing? Phenomenology and the Cognitive Sciences, 16(4), 715–736.
Wiese, W. (2017). Action is enabled by systematic misrepresentations. Erkenntnis, 82(6), 1233–1252.
Acknowledgements
We are grateful to the Ruhr University Research School PLUS for supporting mutual visits between Ruhr-Universität Bochum and Monash University in the years 2016–2017, during which this paper was conceived. FM is supported by the Mercator Research Center Ruhr (MERCUR) project Pr-2016-0016 and by the Center for Mind and Cognition of the Ruhr-Universität Bochum. JH is supported by the Australian Research Council DP160102770 and DP190101805 and by the Research School Bochum and the Center for Mind and Cognition, Ruhr-Universität Bochum.
Cite this article
Marchi, F., Hohwy, J. The Intermediate Scope of Consciousness in the Predictive Mind. Erkenn 87, 891–912 (2022). https://doi.org/10.1007/s10670-020-00222-7