Abstract
This article argues that if panpsychism is true, then there are grounds for thinking that digitally-based artificial intelligence (AI) may be incapable of having coherent macrophenomenal conscious experiences. Section 1 briefly surveys research indicating that neural function and phenomenal consciousness may both be analog in nature. We show that physical and phenomenal magnitudes—such as rates of neural firing and the phenomenally experienced loudness of sounds—appear to covary monotonically with the physical stimuli they represent, forming the basis for an analog relationship among the three. Section 2 then argues that if this is true and micropsychism—the panpsychist view that phenomenal consciousness or its precursors exist at a microphysical level of reality—is also true, then human brains must somehow manipulate fundamental microphysical-phenomenal magnitudes in an analog manner that renders them phenomenally coherent at a macro level. However, Sect. 3 argues that because digital computation abstracts away from microphysical-phenomenal magnitudes—representing cognitive functions non-monotonically in terms of digits (such as ones and zeros)—digital computation may be inherently incapable of realizing coherent macroconscious experience. Thus, if panpsychism is true, digital AI may be incapable of achieving phenomenal coherence. Finally, Sect. 4 briefly examines our argument’s implications for Tononi’s Integrated Information Theory (IIT) of consciousness, which we contend may need to be supplanted by a theory of macroconsciousness as analog microphysical-phenomenal information integration.
Notes
Our argument differs substantially from another recent line of argument that (some) digital computers may be incapable of phenomenal consciousness (Tononi & Koch, 2015). Based on Tononi’s Integrated Information Theory (IIT) of consciousness, which holds that phenomenal consciousness is identical to maximally integrated information, Tononi and Koch argue that two systems can have the same input-output function while only one of them integrates information. On IIT, the non-integrating system—even if it otherwise functioned like a human brain—would be a “zombie” system with no conscious experience (see also Oizumi et al., 2014, pp. 19–22). Further, as Tononi & Koch (2015) and Koch (2019) elaborate, because current digital computers cannot integrate information in anything like the fine-grained way that human brains do (Koch, 2019, pp. 142–144), if IIT is true, then it may take neuromorphic electronic hardware “built according to the brain’s design principles” for AI to “amass sufficient intrinsic cause-effect power to feel like something” (ibid., p. 150; see also Tononi & Koch 2015, p. 16, fn. 15). Our argument is more radical than this in at least two respects. First, our argument implies that even a “neuromorphic” digital machine could fail to realize coherent macrophenomenal consciousness—for such a machine might still fail to manipulate fundamental microphysics in a way necessary for combining fundamental phenomenal qualities into a coherent macrophenomenal manifold. Second, as we explain in Sect. 4, our argument entails that IIT itself may be false. If we are correct, then the only way for AI systems to have coherent macrophenomenal consciousness may be for them to be analog machines that integrate fundamental microphysical-phenomenal magnitudes in the right way.
There are other ways of naming numbers, such as using words in a natural language (e.g. “four”), using different kinds of numerical conventions (e.g. Roman numerals), or using purely conventional single symbols (e.g. “π”). Digital representations (and other numerical conventions) are a special kind of name: they specify the value of the number named as a function of the individual numerals. Chrisomalis (2020) discusses these systems—and many others—in fascinating detail.
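To make this concrete, a positional (digital) numeral names a number whose value is a fixed function of its digits and base. The following minimal sketch (our illustration, not part of Chrisomalis’s discussion) computes that function:

```python
def positional_value(digits, base):
    """Value named by a positional numeral, given as a list of
    digits from most to least significant (e.g. [1, 0, 0])."""
    value = 0
    for d in digits:
        # Each step shifts the accumulated value one place leftward
        # and adds the next digit.
        value = value * base + d
    return value

# "100" in binary names the same number as "4" in decimal:
assert positional_value([1, 0, 0], 2) == positional_value([4], 10) == 4
```

The same digit sequence names different numbers under different bases, which is precisely the sense in which the value is a function of both the numerals and the convention.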
Analog computers are now virtually never used for these purposes, although they were the dominant computing paradigm before the 1970s.
“Non-analog” is typically taken to mean “digital”, but there are other schemes that are neither (e.g. the symbol “π”, mentioned in footnote 2 above).
Although some readers may doubt whether hues are magnitudes—on the grounds that hue as such does not come in degrees—particular hues clearly do come in degrees: something can be more or less red, as well as more or less of a combination of one hue with another (different orange hues, for example, are different graded combinations of red and yellow).
We want to distinguish here between having a phenomenal experience of a banana and having an experience as of a banana, where the difference is as follows. As Fisher (2007) contends, perhaps all that a physical, functional, or phenomenal state must do in order to represent (or be of) a banana is to be causally or functionally related to bananas in the external environment in some way. That may well be the case. Still, whether a phenomenal experience resembles an actual banana in a coherent fashion (viz. first-personally looking like or seeming as of a small yellow fruit) also seems relevant to representation: namely, for qualitatively representing the banana in consciousness as it really is (Summers & Arvan, forthcoming, §3). Our point is that even if digital AI could represent bananas in Fisher’s externalist sense—visually “tracking” bananas in their environment—digital AI may be unable to do so in a manner that produces a coherent first-personal phenomenal experience as of yellow bananas.
We thank an anonymous reviewer for pressing this concern.
In a binary scheme, by contrast, two digits would be needed to represent four different values.
References
Adams, Z. (2019). The history and philosophical significance of the analog/digital distinction. Colloquium Series, Pitt Center for Philosophy of Science
Beck, J. (2015). Analogue magnitude representations: A philosophical introduction. British Journal for the Philosophy of Science, 66(4), 829–855. https://doi.org/10.1093/bjps/axu014
Beck, J. (2019). Perception is analog: The argument from Weber’s Law. The Journal of Philosophy, 116(6), 319–349. https://doi.org/10.5840/jphil2019116621
Biggs, S. (2009). The scrambler: An argument against representationalism. Canadian Journal of Philosophy, 39(2), 215–236. https://doi.org/10.1353/cjp.0.0046
Block, N. (1978). Troubles with functionalism. Minnesota Studies in the Philosophy of Science, 9, 261–325
Chalmers, D. J. (1996). The conscious mind: In search of a fundamental theory. Oxford University Press
Chalmers, D. J. (2013). Panpsychism and panprotopsychism. The Amherst Lecture in Philosophy, 8, 1–35
Chalmers, D. J. (2016). The combination problem for panpsychism. In G. Brüntrup, & L. Jaskolla (Eds.), Panpsychism. Oxford University Press
Chiang, C. C., Shivacharan, R. S., Wei, X., Gonzales-Reyes, L. E., & Durand, D. M. (2018). Slow periodic activity in the longitudinal hippocampal slice can self-propagate non-synaptically by a mechanism consistent with ephaptic coupling. The Journal of Physiology, 597(1), 249–269. https://doi.org/10.1113/JP276904
Chrisomalis, S. (2020). Reckonings: Numerals, cognition, and history. MIT Press
Dennett, D. C. (1993). Consciousness explained. Back Bay Books
Fisher, J. C. (2007). Why nothing mental is just in the head. Noûs, 41(2), 318–334. https://doi.org/10.1111/j.1468-0068.2007.00649.x
Fisher, M. P. A. (2015). Quantum cognition: the possibility of processing with nuclear spins in the brain. Annals of Physics, 362, 593–602. https://doi.org/10.1016/j.aop.2015.08.020
Goff, P. (2017). Consciousness and Fundamental Reality. Oxford University Press
Goff, P., Seager, W., & Allen-Hermanson, S. (2020). Panpsychism. In E. N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Summer 2020 Edition), https://plato.stanford.edu/archives/sum2020/entries/panpsychism/
Goodman, N. (1968). Languages of art: An approach to a theory of symbols. Hackett Publishing Company, Inc.
Hoel, E. P., Albantakis, L., & Tononi, G. (2013). Quantifying causal emergence shows that macro can beat micro. Proceedings of the National Academy of Sciences, 110(49), 19790–19795. https://doi.org/10.1073/pnas.1314922110
Katz, M. (2008). Analog and digital representation. Minds and Machines, 18(3), 403–408. https://doi.org/10.1007/s11023-008-9112-8
Koch, C. (2019). The feeling of life itself: Why consciousness is widespread but can’t be computed. The MIT Press
Kulvicki, J. (2015). Analog representation and the parts principle. Review of Philosophy and Psychology, 6(1), 165–180. https://doi.org/10.1007/s13164-014-0218-z
Lewis, D. K. (1971). Analog and digital. Noûs, 5(3), 321–327
Maley, C. J. (2011). Analog and digital, continuous and discrete. Philosophical Studies, 155(1), 117–131. https://doi.org/10.1007/s11098-010-9562-8
Maley, C. J. (2018a). Brains as analog computers. Medium. https://medium.com/the-spike/brains-as-analog-computers-fa297021f935
Maley, C. J. (2018b). Toward analog neural computation. Minds and Machines, 28(1), 77–91. https://doi.org/10.1007/s11023-017-9442-5
Maley, C. J. (2020). Continuous neural spikes and information theory. Review of Philosophy and Psychology, 11, 647–667. https://doi.org/10.1007/s13164-018-0412-5
Maley, C. J. (forthcoming). Analog computation and representation. British Journal for the Philosophy of Science. https://doi.org/10.1086/715031
McGinn, C. (1989). Can we solve the mind-body problem? Mind, 98(391), 349–366
Mørch, H. H. (2014). Panpsychism and causation: A new argument and a solution to the combination problem. Dissertation, Oslo
Nagasawa, Y., & Wager, K. (2016). Panpsychism and priority cosmopsychism. In G. Brüntrup, & L. Jaskolla (Eds.), Panpsychism: Contemporary Perspectives. Oxford University Press
Nagel, T. (1974). What is it like to be a bat? The Philosophical Review, 83(4), 435–450
Oizumi, M., Albantakis, L., & Tononi, G. (2014). From the phenomenology to the mechanisms of consciousness: integrated information theory 3.0. PLoS Computational Biology, 10(5), e1003588. https://doi.org/10.1371/journal.pcbi.1003588
Peacocke, C. (2019). The primacy of metaphysics. Oxford University Press
Pereira, A. Jr., & Furlan, F. A. (2009). On the role of synchrony for neuron–astrocyte interactions and perceptual conscious processing. Journal of Biological Physics, 35(4), 465–480. https://doi.org/10.1007/s10867-009-9147-y
Russell, B. (1921). The analysis of mind. George Allen and Unwin
Russell, B. (1927). The analysis of matter. George Allen and Unwin
Schwitzgebel, E. (2006). The unreliability of naive introspection. Philosophical Review, 117(2), 245–273. https://doi.org/10.1215/00318108-2007-037
Searle, J. R. (1980). Minds, brains, and programs. Behavioral and Brain Sciences, 3, 417–457
Searle, J. R. (1984). Minds, brains and science. Harvard University Press
Searle, J. R. (1998). How to study consciousness scientifically. Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences, 353(1377), 1935–1942
Smart, J. J. C. (2017). The mind/brain identity theory. In E. N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Spring 2017 Edition), https://plato.stanford.edu/archives/spr2017/entries/mind-identity/
Strawson, G. (2006). Realistic monism: Why physicalism entails panpsychism. Journal of Consciousness Studies, 13(10–11), 3–31
Strawson, G. (2016). Mind and being: The primacy of panpsychism. In G. Brüntrup, & L. Jaskolla (Eds.), Panpsychism: Contemporary Perspectives. Oxford University Press
Summers, M., & Arvan, M. (forthcoming). Two new doubts about simulation hypotheses. Australasian Journal of Philosophy. https://doi.org/10.1080/00048402.2021.1913621
Thyrhaug, E., Tempelaar, R., Alcocer, M. J. P., Žídek, K., Bína, D., Knoester, J. … Zigmantas, D. (2018). Identification and characterization of diverse coherences in the Fenna-Matthews-Olson complex. Nature Chemistry, 10(7), 780–786. https://doi.org/10.1038/s41557-018-0060-5
Tononi, G. (2012). The integrated information theory of consciousness: an updated account. Archives italiennes de biologie, 150(2/3), 56–90. https://doi.org/10.4449/aib.v149i5.138
Tononi, G. (2015). Integrated information theory. Scholarpedia: the peer-reviewed open-access encyclopedia, http://www.scholarpedia.org/article/Integrated_information_theory
Tononi, G., Boly, M., Massimini, M., & Koch, C. (2016). Integrated information theory: from consciousness to its physical substrate. Nature Reviews Neuroscience, 17(7), 450–461. https://doi.org/10.1038/nrn.2016.44
Tononi, G., & Koch, C. (2015). Consciousness: here, there and everywhere? Philosophical Transactions of the Royal Society B: Biological Sciences, 370(1668), 20140167: 1–18. https://doi.org/10.1098/rstb.2014.0167
Zbili, M., & Debanne, D. (2019). Past and future of analog-digital modulation of synaptic transmission. Frontiers in Cellular Neuroscience, 13, 1–12. https://doi.org/10.3389/fncel.2019.00160
Acknowledgements
We are grateful to several sets of anonymous reviewers, Philippe Chuard, Gerardo Viera, and audience members at the 2021 Meeting of the Pacific Division of the APA for helpful feedback on earlier drafts of this paper.
Ethics declarations
Conflict of interest
The authors have no competing financial or non-financial interests directly or indirectly related to the work submitted for publication.
Arvan, M., Maley, C. Panpsychism and AI consciousness. Synthese 200, 244 (2022). https://doi.org/10.1007/s11229-022-03695-x