
Brain and mind operational architectonics and man-made “machine” consciousness

  • Review
  • Published in Cognitive Processing

Abstract

Building a truly conscious robot requires that the robot’s “brain” be capable of supporting phenomenal consciousness of the kind the human brain enjoys. The Operational Architectonics framework, by exploring the temporal structure of information flow and inter-area interactions within networks of functional neuronal populations [through examination of topographic sharp transition processes in the scalp electroencephalogram (EEG) on the millisecond scale], reveals and describes an EEG architecture that is analogous to the architecture of the phenomenal world. This suggests that the task of creating “machine” consciousness would require a machine implementation capable of supporting the kind of hierarchical architecture found in the EEG.
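As a purely illustrative rendering of what detecting “topographic sharp transition processes” on the millisecond scale could look like, the following Python sketch flags abrupt changes in the ongoing amplitude of a single simulated EEG channel. It is a toy change-point detector under simplifying assumptions, not the segmentation method published by the authors; the synthetic signal, window length, and threshold are all hypothetical.

```python
import numpy as np

def rapid_transition_points(eeg, fs, win_ms=100, ratio_thresh=2.0):
    """Flag candidate rapid transition points in one EEG channel by
    comparing mean absolute amplitude in two adjacent sliding windows.
    Toy change-point sketch; NOT the authors' published method."""
    w = max(1, int(fs * win_ms / 1000))        # window length in samples
    amp = np.abs(eeg)
    boundaries = []
    for t in range(w, len(eeg) - w):
        left = amp[t - w:t].mean()             # ongoing amplitude before t
        right = amp[t:t + w].mean()            # ongoing amplitude after t
        ratio = max(left, right) / (min(left, right) + 1e-12)
        if ratio > ratio_thresh:
            if not boundaries or t - boundaries[-1] > w:   # debounce
                boundaries.append(t)
    return np.array(boundaries) / fs           # boundary times in seconds

# Synthetic demo: a 10 Hz rhythm whose amplitude jumps fivefold at t = 1 s.
fs = 500
t = np.arange(0, 2, 1 / fs)
eeg = np.where(t < 1.0, 5.0, 25.0) * np.sin(2 * np.pi * 10 * t)
print(rapid_transition_points(eeg, fs))        # boundaries near t = 1 s
```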


Notes

  1. This distinction is well known as the dichotomy between Weak and Strong Artificial Consciousness (Holland 2003): Weak Artificial Consciousness deals with the design and construction of machines that simulate consciousness or the cognitive processes usually correlated with it, while Strong Artificial Consciousness aims to design a truly conscious machine. This separation mirrors the separation between the ‘easy’ and the ‘hard’ problem of consciousness (Chalmers 1996). According to this distinction, the ‘easy problem’ refers to explaining the ability to discriminate, integrate information, report mental states, focus attention, and so on, whereas the ‘hard problem’ asks why subjective awareness of sensory information exists at all. Both of these dichotomies are strongly related to a third one, the dichotomy between ‘access’ and ‘phenomenal’ consciousness (Block 1995). Within this distinction, access consciousness is defined as availability for use in reasoning and for the rational guidance of speech, action, and thought. In contrast, phenomenal consciousness is subjective experience.

  2. Even though there have been first attempts to create a so-called ‘synthetic phenomenology’ (Gamez 2005, 2006; Stening et al. 2005; Chrisley and Parthemore 2007; Kiverstein 2007; Hesslow and Jirenhed 2007; Ikegami 2007; Haikonen 2007a, b), it is too early to speak even about the possibility of achieving genuinely conscious machines. Despite productive work, there is a strong awareness in the field of synthetic phenomenology that something crucial is still missing from current implementations of autonomous systems (see Manzotti 2007; Ziemke 2007; Dreyfus 2007; Koch and Tononi 2008).

  3. In other words, phenomenal consciousness is a higher level of biological organization in the brain (Revonsuo 2006).

  4. Although it is often claimed that volume conduction is the main obstacle to interpreting EEG data, we have shown through modeling experiments that proper EEG methodology reveals an EEG architecture that is sensitive to the morpho-functional organization of the cortex rather than to volume conduction and/or the reference electrode (for relevant details, we refer the reader to Kaplan et al. 2005).

  5. These are largely still to be devised.

  6. Even though this framework has many similarities with other theoretical conceptualizations, it is quite distant from them in its core principles (for a detailed comparative analysis, see Fingelkurts and Fingelkurts 2006). Additionally, in the context of the OA framework (and in contrast to other theories), there is a range of methodological tools which make it possible in practice to measure the postulated entities of the theory (Fingelkurts and Fingelkurts 2008).

  7. These EEG phenomena are rarely exploited because of a lack of analytical tools and methodology. Special techniques, which take into consideration the inherent nature and structure of the EEG signal, are required to detect them (Fingelkurts and Fingelkurts 2001, 2008).

  8. See Fingelkurts and Fingelkurts (2005).

  9. Quantitatively, this phenomenon is assessed through a measure of the synchronization of EEG segments (structural synchrony) obtained from different brain locations (Fingelkurts and Fingelkurts 2001, 2005); a toy coincidence-counting illustration is sketched after these notes.

  10. Each OM is a metastable spatio-temporal pattern of brain activity. It is metastable because the neuronal assemblies that constitute an OM have different operations/functions, and each does its own inherent “job” (thus expressing an autonomous tendency) while still remaining temporally entangled with the others (thus expressing coordinated activity) in order to execute a common complex operation or complex cognitive act of a higher hierarchy (Fingelkurts and Fingelkurts 2004b, 2005). As proposed by Kelso (1995), metastability relates precisely to the phenomenon of a constant interplay between these autonomous and interdependent tendencies (see also Bressler and Kelso 2001); a minimal coupled-oscillator illustration of such interplay is given after these notes.

  11. Note that we use the term ‘nonconsciousness’ instead of ‘unconsciousness’. We will not enter into the extensive debate about the difference between these terms, but they are quite distinct and should not be conflated. In short, unconscious material is still present somewhere within the mental sphere, but is usually inaccessible to awareness; only within the mental world does it make sense to speak of conscious or unconscious events or processes (Allen 1994). In contrast, nonconscious processes are nonmental in nature: they are simply not available to mental experience; they are physical or neurophysiological processes (Searle 1992).

  12. The actual neurophysiological machinery is hidden from our awareness; it is transparent to us. In other words, we have access only to the content, but not the vehicle, of the phenomenal information (see also Revonsuo 2006; Haikonen 2007b).

  13. Isomorphism is generally defined as a mapping of one entity onto another having the same elemental structure, whereby the behaviors of the two entities are identically describable (Warfield 1977). A functional isomorphism, on the other hand, requires functional connectivity between the component entities (Lehar 2003); it is an extension of Muller’s psychophysical postulate (Muller 1896) and Chalmers’ principle of structural coherence (Chalmers 1995). Therefore, two systems that are functionally isomorphic are, in virtue of this fact, different realizations of the same kind (Shapiro 2000). In other words, two functionally isomorphic yet different systems bring about the same function that defines the kind. And if two particulars differ only in properties that in no way affect the achievement of the defining capacity of a kind, then there is no reason to say that they are tokens of different realizations of that kind (Shapiro 2000). A toy structural-isomorphism check is sketched after these notes.

  14. Such an approach coincides with the positive methodology suggested by Chalmers (1995) for facing up to the hard problem of consciousness. The main points of this methodology are: (a) pay careful attention both to physical processing and to phenomenology, (b) find systematic regularities between the two, (c) work down to the simpler principles that explain these regularities in turn, and (d) ultimately explain the connection in terms of a simple set of fundamental laws.

  15. An alternative approach to consciousness, based on a process-oriented ontology (Whitehead 1929/1978; Griffin 1998), has been suggested by Manzotti (2006). According to Manzotti, consciousness and physical reality can be conceived as two perspectives on the same processes. In this case there is no problem of re-presentation, since the experience and the occurrence of the world are identical. More precisely, phenomenal experiences do not represent reality but are reality (for details see Manzotti 2006).

  16. Both the material neurophysiological organization that characterizes the brain and the informational order that characterizes the mind necessarily involve such events as operations at their core (Fingelkurts and Fingelkurts 2003, 2005).

  17. This may seem an extreme point of view, but it is one that is gaining some currency in recent discussions of brain–mind interaction (for examples, see Pockett 2000; McFadden 2002; Revonsuo 2006; Freeman 2007a, b).
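To make the coincidence-counting idea behind “structural synchrony” (note 9) concrete, here is a minimal Python sketch: given the rapid-transition (segment boundary) times of two channels, it counts boundaries that coincide within a tolerance and compares the count with a uniform-shuffle baseline. This is an illustration only, not the authors’ published estimator; the tolerance, recording duration, and jitter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def coincidences(b1, b2, tol):
    """Count boundaries in b1 that have a partner in b2 within +/- tol s."""
    return sum(np.any(np.abs(b2 - t) <= tol) for t in b1)

def structural_synchrony(b1, b2, tol=0.01, n_shuffles=200, t_max=60.0):
    """Toy index of structural synchrony between two channels, given their
    rapid-transition (segment boundary) times in seconds: the observed
    coincidence count versus a uniform-shuffle baseline.
    (Illustrative only; not the authors' published estimator.)"""
    obs = coincidences(b1, b2, tol)
    null = [coincidences(rng.uniform(0, t_max, b1.size), b2, tol)
            for _ in range(n_shuffles)]
    return obs, float(np.mean(null)), float(np.std(null))

# Demo: channel 2 shares 60 of channel 1's 80 boundaries (+/- 5 ms jitter).
b1 = np.sort(rng.uniform(0, 60, 80))
b2 = np.sort(b1[:60] + rng.normal(0, 0.005, 60))
print(structural_synchrony(b1, b2))   # observed count far above the baseline
```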
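The interplay between autonomous and coordinated tendencies mentioned in note 10 can be illustrated with two weakly coupled phase oscillators: below the locking threshold, the phase difference dwells for long stretches near alignment (coordination) and then slips by 2π (autonomy). This generic Kuramoto-style sketch is in the spirit of Kelso’s coordination dynamics, not a model of operational modules themselves; all parameter values are illustrative.

```python
import numpy as np

def phase_difference(w1=10.0, w2=10.6, K=0.29, dt=1e-3, T=200.0):
    """Euler-integrate two coupled phase oscillators,
        th1' = w1 + K*sin(th2 - th1),   th2' = w2 - K*sin(th2 - th1),
    and return the phase difference over time. Locking would require
    K >= |w2 - w1| / 2 = 0.3; K = 0.29 sits just below it, so the phase
    difference dwells near alignment for long stretches (coordination)
    and then slips by 2*pi (autonomy): metastability.
    (Generic illustration, not the OA model itself.)"""
    n = int(T / dt)
    th1, th2 = 0.0, 0.0
    phi = np.empty(n)
    for i in range(n):
        d = np.sin(th2 - th1)
        th1 += (w1 + K * d) * dt
        th2 += (w2 - K * d) * dt
        phi[i] = th2 - th1
    return phi

phi = phase_difference()
print("2*pi phase slips in 200 s:", int(phi[-1] // (2 * np.pi)))  # ~4-5
```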
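Finally, the definition of isomorphism in note 13 (a mapping preserving elemental structure) can be made concrete with a brute-force check for a structure-preserving bijection between two small directed graphs, standing in for two different “realizations” of the same relational kind. The node labels and edge sets here are, of course, hypothetical.

```python
from itertools import permutations

def find_isomorphism(nodes_a, edges_a, nodes_b, edges_b):
    """Brute-force search for a structure-preserving bijection: the two
    relational systems are isomorphic iff some mapping of nodes_a onto
    nodes_b carries the edge set of A exactly onto the edge set of B."""
    if len(nodes_a) != len(nodes_b) or len(edges_a) != len(edges_b):
        return None
    for perm in permutations(nodes_b):
        f = dict(zip(nodes_a, perm))
        if {(f[u], f[v]) for u, v in edges_a} == set(edges_b):
            return f                    # a witnessing mapping
    return None

# Two hypothetical "realizations" sharing the same elemental structure
# (a directed 3-cycle): different tokens, same kind.
neural  = (["x", "y", "z"], [("x", "y"), ("y", "z"), ("z", "x")])
silicon = (["p", "q", "r"], [("p", "q"), ("q", "r"), ("r", "p")])
print(find_isomorphism(*neural, *silicon))  # e.g. {'x': 'p', 'y': 'q', 'z': 'r'}
```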

References

  • Aleksander I, Dunmall B (2003) Axioms and tests for the presence of minimal consciousness in agents. J Conscious Stud 10:7–18

  • Allen JA (1994) Delineating conscious and unconscious processes: commentary on Baars on contrastive analysis. Psyche 1(9). http://psyche.cs.monash.edu.au/v2/psyche-1-9-allen.html

  • Block N (1995) On a confusion about a function of consciousness. Behav Brain Sci 18:227–287

  • Bressler SL, Kelso JAS (2001) Cortical coordination dynamics and cognition. Trends Cogn Sci 5:26–36. doi:10.1016/S1364-6613(00)01564-3

  • Chalmers DJ (1995) Facing up to the problems of consciousness. J Conscious Stud 2:200–219

  • Chalmers DJ (1996) The conscious mind: In search of a fundamental theory. Oxford University Press, New York

  • Chella A, Manzotti R (2007) Artificial consciousness. Imprint Academic, Exeter

  • Chrisley R, Parthemore J (2007) Synthetic phenomenology: exploiting embodiment to specify the non-conceptual content of visual experience. J Conscious Stud 14:44–58

  • Clowes R, Torrance S, Chrisley R (2007) Machine consciousness: embodiment and imagination. J Conscious Stud 14:7–14

  • Dainton B (2000) Stream of consciousness. Routledge, London

  • Dennett DC (1991) Consciousness explained. Little Brown, Boston

  • Dreyfus HL (2007) Why Heideggerian AI failed and how fixing it would require making it more Heideggerian. Philos Psychol 20:247–268. doi:10.1080/09515080701239510

  • Fingelkurts AA, Fingelkurts AA (2001) Operational architectonics of the human brain biopotential field: towards solving the mind-brain problem. Brain Mind 2:261–296. doi:10.1023/A:1014427822738

  • Fingelkurts AA, Fingelkurts AA (2003) Operational architectonics of perception and cognition: a principle of self-organized metastable brain states. VI Parmenides Workshop “Perception and Thinking” of the Institute of Medical Psychology, University of Munich, Elba, 5–10 April

  • Fingelkurts AA, Fingelkurts AA (2004a) The Operational Architectonics concept of brain and mind functioning. Congress on Modeling Mental Processes and Disorders. Kusadasi 24–29 May

  • Fingelkurts AA, Fingelkurts AA (2004b) Making complexity simpler: multivariability and metastability in the brain. Int J Neurosci 114:843–862. doi:10.1080/00207450490450046

  • Fingelkurts AA, Fingelkurts AA (2005) Mapping of the brain operational architectonics. In: Chen FJ (ed) Focus on brain mapping research, Chap 2. Nova Science Publishers Inc, New York, pp 59–98

  • Fingelkurts AA, Fingelkurts AA (2006) Timing in cognition and EEG brain dynamics: discreteness versus continuity. Cogn Process 7:135–162. doi:10.1007/s10339-006-0035-0

  • Fingelkurts AA, Fingelkurts AA (2008) Brain-mind Operational Architectonics imaging: technical and methodological aspects. Open Neuroimag J 2:73–93

  • Freeman WJ (2007a) Indirect biological measures of consciousness from field studies of brains as dynamical systems. Neural Netw 20:1021–1031. doi:10.1016/j.neunet.2007.09.004

  • Freeman WJ (2007b) Definitions of state variables and state space for brain-computer interface. Part 1. Multiple hierarchical levels of brain function. Cogn Neurodyn 1:3–14. doi:10.1007/s11571-006-9001-x

  • Gamez D (2005) An ordinal probability scale for synthetic phenomenology. In: Chrisley R, Clowes R, Torrance S (eds) Proceedings of the AISB05 symposium on next generation approaches to machine consciousness. Hatfield, UK, pp 85–94

  • Gamez D (2006) The XML approach to synthetic phenomenology. In: Chrisley R, Clowes R, Torrance S (eds) Proceedings of the AISB06 symposium on integrative approaches to machine consciousness. Bristol, UK, pp 128–135

  • Gamez D (2008) Progress in machine consciousness. Conscious Cogn 17:887–910. doi:10.1016/j.concog.2007.04.005

  • Griffin DR (1998) Unsnarling the world-knot: consciousness, freedom and the mind-body problem. University of California Press, Berkeley

  • Haikonen PO (2007a) Robot brains: circuits and systems for conscious machines. Wiley, UK

  • Haikonen PO (2007b) Essential issues of conscious machines. J Conscious Stud 14:72–84

  • Hesslow G, Jirenhed D-A (2007) The inner world of a simple robot. J Conscious Stud 14:85–96

  • Holland O (2003) Editorial introduction. J Conscious Stud 12:1–6

  • Ikegami T (2007) Simulating active perception and mental imagery with embodied chaotic itinerancy. J Conscious Stud 14:111–125

  • James W (1890) The principles of psychology, vol I. Dover, New York

  • Kaplan AYa, Fingelkurts AA, Fingelkurts AA, Borisov SV, Darkhovsky BS (2005) Nonstationary nature of the brain activity as revealed by EEG/MEG: methodological, practical and conceptual challenges. Signal Process 85:2190–2212. doi:10.1016/j.sigpro.2005.07.010

  • Kelso JAS (1995) Dynamic patterns: the self-organization of brain and behavior. MIT Press, Cambridge

  • Kiverstein J (2007) Could a robot have a subjective point of view? J Conscious Stud 14:127–139

  • Koch C (2004) The quest for consciousness: a neurobiological approach. Roberts & Company Publishers, Englewood (Col)

  • Koch C, Tononi G (2008) Can machines be conscious? IEEE Spectr 6:47–51

  • Laureys S, Pellas F, Van Eeckhout F, Ghorbel S, Schnakers C et al (2005) The locked-in syndrome: what is it like to be conscious but paralyzed and voiceless? In: Laureys S (ed) Progress in brain research, vol 150. Elsevier, Amsterdam, pp 495–511

  • Lehar S (2003) Gestalt isomorphism and the primacy of subjective conscious experience: a gestalt bubble model. Behav Brain Sci 26:375–408

  • von der Malsburg C (1999) The what and why of binding: the modeler’s perspective. Neuron 24:95–104. doi:10.1016/S0896-6273(00)80825-9

  • Manzotti R (2006) A process oriented view of conscious perception. J Conscious Stud 13:7–41

  • Manzotti R (2007) Towards artificial consciousness. APA Newsl Philos Comput 07:12–15

  • Markram H (2006) The blue brain project. Nat Rev Neurosci 7:153–160. doi:10.1038/nrn1848

  • McFadden J (2002) Synchronous firing and its influence on the brain’s electromagnetic field. Evidence for an electromagnetic field theory of consciousness. J Conscious Stud 9:23–50

  • Metzinger T (2003) Being no one. MIT Press, Cambridge

  • Moran D (2000) Introduction to phenomenology. Routledge, London

  • Muller GE (1896) Zur Psychophysik der Gesichtsempfindungen. Z Psychol 10:1–82

  • Nagel T (1974) What is it like to be a bat? Philos Rev 83:435–450. doi:10.2307/2183914

  • Nunez PL (2000) Toward a quantitative description of large-scale neocortical dynamic function and EEG. Behav Brain Sci 23:371–398. doi:10.1017/S0140525X00003253

  • Pockett S (2000) The nature of consciousness: a hypothesis. Writers Club Press, Lincoln

  • Revonsuo A (2001) Can functional brain imaging discover consciousness in the brain? J Conscious Stud 8:3–23

  • Revonsuo A (2006) Inner presence: Consciousness as a biological phenomenon. MIT Press, Cambridge

  • Searle JR (1992) The rediscovery of the mind. MIT Press, Cambridge

  • Shannon CE (1948) A mathematical theory of communication. Bell Syst Tech J 27:379–423, 623–656

  • Shapiro LA (2000) Multiple realizations. J Philos 97:635–654. doi:10.2307/2678460

  • Sloman A, Chrisley RL (2005) More things than are dreamt of in your biology: Information-processing in biologically inspired robots. Cogn Syst Res 6:145–174. doi:10.1016/j.cogsys.2004.06.004

  • Stening J, Jacobsson H, Ziemke T (2005) Imagination and abstraction of sensorimotor flow: Towards a robot model. In: Chrisley R, Clowes R, Torrance S (eds) Proceedings of the AISB05 symposium on next generation approaches to machine consciousness. Hatfield, UK, pp 50–58

  • Stubenberg L (1998) Consciousness and qualia. John Benjamins, Amsterdam

  • Warfield JN (1977) Crossing theory and hierarchy mapping. IEEE Trans Syst Man Cybern 7:505–523. doi:10.1109/TSMC.1977.4309760

  • Whitehead AN (1929/1978) Process and reality. Free Press, London

  • Zeki S (2003) The disunity of consciousness. Trends Cogn Sci 7:214–218. doi:10.1016/S1364-6613(03)00081-0

  • Zeki S, Bartels A (1999) Towards a theory of visual consciousness. Conscious Cogn 8:225–259. doi:10.1006/ccog.1999.0390

  • Ziemke T (2007) What’s life got to do with it? In: Chella A, Manzotti R (eds) Artificial consciousness. Imprint Academic, Exeter, pp 48–66

Acknowledgment

This theoretical work was supported by BM-Science.

Author information

Correspondence to Andrew A. Fingelkurts.

Cite this article

Fingelkurts, A.A., Fingelkurts, A.A. & Neves, C.F.H. Brain and mind operational architectonics and man-made “machine” consciousness. Cogn Process 10, 105–111 (2009). https://doi.org/10.1007/s10339-008-0234-y
