
Conscious machines: Memory, melody and muscular imagination

Phenomenology and the Cognitive Sciences

Abstract

A great deal of effort has been, and continues to be, devoted to developing consciousness artificially (Cotterill 1995, 1998; Haikonen 2003; Aleksander and Dunmall 2003; Sloman 2004, 2005; Aleksander 2005; Holland and Knight 2006; Chella and Manzotti 2007), and yet a similar amount of effort has gone into demonstrating the infeasibility of the whole enterprise (most notably Dreyfus 1972/1979, 1992, 1998; Searle 1980; Harnad 2003; Sternberg 2007; amongst many others). My concern in this paper is to steer some navigable channel between the two positions, laying out the necessary pre-conditions for consciousness in an artificial system, and concentrating on what needs to hold for the system to perform as a human being or other phenomenally conscious agent in an intersubjectively-demanding social and moral environment. By adopting a thick notion of embodiment—one that is bound up with the concepts of the lived body and autopoiesis (Maturana and Varela 1980; Varela et al. 2003; Ziemke 2003, 2007a, b)—I will argue that machine phenomenology is only possible within an embodied distributed system that possesses a richly affective musculature and a nervous system such that it can, through action and repetition, develop its tactile-kinaesthetic memory, individual kinaesthetic melodies pertaining to habitual practices, and an anticipatory enactive kinaesthetic imagination. Without these capacities the system would remain unconscious, unaware of itself as embodied within a world.
Finally, and following on from Damasio’s (1991, 1994, 1999, 2003) claims for the necessity of pre-reflective conscious, emotional, bodily responses for the development of an organism’s core and extended consciousness, I will argue that without these capacities any agent would be incapable of developing the sorts of somatic markers or saliency tags that enable affective reactions, and which are indispensable for effective decision-making and subsequent survival. My position, as presented here, remains agnostic about whether or not the creation of artificial consciousness is an attainable goal.


Notes

  1. A small selection of the many authors writing in this area includes: Cotterill (1995, 1998), Haikonen (2003), Aleksander and Dunmall (2003), Sloman (2004, 2005), Aleksander (2005), Holland and Knight (2006) and Chella and Manzotti (2007).

  2. Most notably: Dreyfus (1972/1979, 1992, 1998), Searle (1980), Harnad (2003) and Sternberg (2007), but there are a great many others.

  3. See: http://fr.wikipedia.org/wiki/Jacques_de_Vaucanson.

  4. Our readiness to over-interpret the actions of the ‘other’ in our midst as intentional is a useful strategy for survival. Even though infants learn to distinguish animate from inanimate, and then to distinguish minded animate from non-minded animate (Stern 1985), it is still wiser to be mistaken occasionally and over-ascribe a mind to some thing rather than under-ascribe and risk becoming that thing’s lunch.

  5. The most obvious exceptions from a clear answer of this sort are art objects.

  6. See: http://marsrover.nasa.gov/home/index.html.

  7. The implications of robot servitude will not be considered in this paper, but they have been discussed in much detail elsewhere, for example, Asimov (1950) and Peterson (2007).

  8. These figures were taken from http://money.cnn.com.

  9. These figures were taken from http://www.antiwar.com.

  10. It is not lost on the author that the characteristics of concern, compassion, and moral wisdom (phronesis) that are associated with being the best human being do not sit well with waging war.

  11. This is by no means a full list of the possible reasons for creating an artificially conscious system; there may even be the (possibly) non-instrumental reason of creating it so that we better understand consciousness, or simply the desire to bring another conscious being into existence.

  12. For a very interesting and detailed discussion of the sensory and proprioceptive capacities of exoskeletal systems and invertebrates in general I recommend Sheets-Johnstone (1998): “hard-bodied invertebrates have external sensilla of various kinds: hairs, exoskeletal plates, epidermal organs, cilia, spines, pegs, slits, and so on. It is these external sensory organs that make possible an awareness of surface events in the double sense noted above: an awareness of the terrain on which and/or the environment through which the animal is moving and an awareness of bodily deformations or stresses occurring coincident with moving on the terrain and/or through the environment” [p.279].

  13. See http://www.consciousness.it/CAI/CAI.htm. Viz. Johnson (1990), Chiel and Beer (1997), Clark (1997), Damasio (1999), Lakoff and Johnson (1999), Seitz (2000), Dobbyn and Stuart (2003), Legrand (2006), and Ziemke (2003, 2007a, b); amongst a great many others.

  14. The concept of ‘environment’ is used thickly to refer to the system’s world and its own variable internal states that are the subject of homeostatic functions.

  15. The CRONOS Project website: http://www.cronosproject.net/.

  16. Oliver Sacks remarks (in the ‘Foreword’ to Cole 1995) that the case of IW “shows how such a peripheral disorder can have the profoundest ‘central’ effects on what Gerald Edelman called the ‘primary consciousness’ of a person: his ability to experience his body as continuous, as ‘owned,’ as controlled, as his. We see that a disorder of touch and proprioception, itself unconscious, becomes, at the highest level, a ‘disease of consciousness’” (xiii). (Gallagher and Cole 1995, footnote 7).

  17. There may also have been some corresponding effect on the individual’s body image. For example, cases of body dysmorphic disorder, in which someone does not believe that a limb is theirs, or of apotemnophilia, in which someone wants a limb amputated because it does not correspond with how they feel themselves to be, might be the result of a faulty or malfunctioning body schema which leaks into how they perceive themselves; but this cannot be taken further in this paper.

  18. Having said this, there are times when phenomenally affective consciousness can lead to ineffective decision-making, and where the operation of a WAC, an unconscious machine, would be preferable. In the case of the mid-air collision of the Bashkirian Airlines Tupolev 154 and the DHL Boeing 757 on 1 July 2002, the pilots gave the conscious human flight controller the benefit of the doubt as he over-ruled the ‘unconscious’ ACAS II-compliant collision-avoidance system’s message; had they not, the disaster might just have been avoided.

References

  • Aleksander, I., & Dunmall, B. (2003). Axioms and tests for the presence of minimal consciousness in agents. Journal of Consciousness Studies, 10(4–5), 7–18.

  • Aleksander, I. (2005). The world in my mind, my mind in the world: Key mechanisms of consciousness in humans, animals and machines. Exeter: Imprint Academic.

  • Asimov, I. (1950). I, Robot. Greenwich: Fawcett.

  • Bauby, J.-D. (1997). The diving-bell and the butterfly. London: Fourth Estate; Harper Perennial, 2004.

  • Brewer, B. (1992). Self-location and agency. Mind, 101, 17–34.

  • Chella, A., & Manzotti, R. (2007). Artificial consciousness. Exeter: Imprint Academic.

  • Chiel, H. J., & Beer, R. D. (1997). The brain has a body: Adaptive behavior emerges from interactions of nervous system, body and environment. Trends in Neurosciences, 20, 553–557.

  • Clark, A. (1997). Being there: Putting brain, body, and world together again. Cambridge: MIT Press.

  • Cole, J. (1995). Pride and a daily marathon. Cambridge: MIT Press (orig. 1991, London: Duckworth).

  • Cole, J. (2005). Imagination after neurological losses of movement and sensation: The experience of spinal cord injury. Phenomenology and the Cognitive Sciences, 4(2), 183–195.

  • Cotterill, R. M. J. (1995). On the unity of conscious experience. Journal of Consciousness Studies, 2(4), 290–311.

  • Cotterill, R. M. J. (1998). Enchanted looms: Conscious networks in brains and computers. Cambridge: Cambridge University Press.

  • Damasio, A. R. (1994). Descartes’ error: Emotion, reason, and the human brain. New York: Grosset/Putnam.

  • Damasio, A. R. (1999). The feeling of what happens: Body, emotion and the making of consciousness. New York: Harcourt Brace.

  • Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. London: Harcourt.

  • Damasio, A. R., Grabowski, T. J., Bechara, A., Damasio, H., Ponto, L. L. B., Parvizi, J., et al. (2000). Subcortical and cortical brain activity during the feeling of self-generated emotions. Nature Neuroscience, 3(10), 1049–1056.

  • Damasio, H., Grabowski, T., Frank, R., Galaburda, A. M., & Damasio, A. R. (1994). The return of Phineas Gage: Clues about the brain from the skull of a famous patient. Science, 264(5162), 1102–1105.

  • Damasio, A. R., Tranel, D., & Damasio, H. (1991). Somatic markers and the guidance of behaviour: Theory and preliminary testing. In H. S. Levin, H. M. Eisenberg, & A. L. Benton (Eds.), Frontal lobe function and dysfunction (pp. 217–229). New York: Oxford University Press.

  • Dobbyn, C., & Stuart, S. A. J. (2003). The self as an embedded agent. Minds and Machines, 13(2), 187–201.

  • Dreyfus, H. (1972/1979). What computers can’t do: A critique of artificial reason. New York: Harper & Row.

  • Dreyfus, H. (1992). What computers “still” can’t do: A critique of artificial reason (revised ed.). Cambridge: MIT Press.

  • Dreyfus, H. (1998). Response to my critics. In T. W. Bynum (Ed.), The digital phoenix. Cambridge: Blackwell.

  • Ekman, P. (1992). Facial expressions of emotions: New findings, new questions. Psychological Science, 3(1), 34–38.

  • Flanagan, O. (1992). Consciousness reconsidered. Cambridge: MIT Press.

  • Gallagher, S. (1986). Body image and body schema: A conceptual clarification. Journal of Mind and Behavior, 7(4), 541–554.

  • Gallagher, S. (2007). Moral agency, self-consciousness, and practical wisdom. Journal of Consciousness Studies, 14(5–6), 199–223.

  • Gallagher, S., & Cole, J. D. (1995). Body schema and body image in a deafferented subject. Journal of Mind and Behavior, 16, 369–390.

  • Gibson, J. J. (1968). The senses considered as perceptual systems. London: George Allen & Unwin.

  • Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.

  • Haikonen, P. (2003). The cognitive approach to conscious machines. Exeter: Imprint Academic.

  • Harlow, J. M. (1848). Passage of an iron rod through the head. Boston Medical and Surgical Journal, 39, 389–393. (Republished with intro. by T. C. Neylan, Frontal lobe function: Mr Phineas Gage’s famous injury, Journal of Neuropsychiatry and Clinical Neuroscience, 11(2), 281–283, 1999.)

  • Harnad, S. (2003). Can a machine be conscious? How? Journal of Consciousness Studies, 10(4–5), 67–75.

  • Head, H., & Holmes, G. M. (1911). Sensory disturbances from cerebral lesions. Brain, 34, 102–254.

  • Heidegger, M. (1962). Being and time. Trans. by John Macquarrie & Edward Robinson. London: SCM.

  • Holland, O. (2003). Machine consciousness. New York: Imprint Academic.

  • Holland, O., & Knight, R. (2006). The anthropomimetic principle. Department of Computer Science, University of Essex.

  • Ings, S. (2007). The eye: A natural history. London: Bloomsbury.

  • Johnson, M. (1990). The body in the mind: The bodily basis of meaning, imagination, and reason. London: University of Chicago Press.

  • Lakoff, G., & Johnson, M. (1999). Philosophy in the flesh: The embodied mind and its challenge to western thought. New York: Basic Books.

  • Legrand, D. (2006). The bodily self: The sensori-motor roots of pre-reflexive self-consciousness. Phenomenology and the Cognitive Sciences, 5(1), 89–118.

  • Luria, A. R. (1973). The working brain: An introduction to neuropsychology. Trans. by Basil Haigh. London: Allen Lane.

  • Maturana, H. R., & Varela, F. J. (1980). Autopoiesis and cognition. Dordrecht: Reidel.

  • Meijsing, M. (2000). Self-consciousness and the body. Journal of Consciousness Studies, 7(6), 34–52.

  • Merleau-Ponty, M. (1962). Phenomenology of perception. Trans. by Colin Smith. London: Routledge & Kegan Paul.

  • Merleau-Ponty, M. (1968). The visible and the invisible. Evanston: Northwestern University Press.

  • Paillard, J. (2005). Vectorial versus configural encoding of body space: A neural basis for a distinction between body schema and body image. In V. Knockaert & H. De Preester (Eds.), Body image and body schema: Interdisciplinary perspectives (pp. 89–109). Amsterdam: John Benjamins.

  • Peterson, S. (2007). The ethics of robot servitude. Journal of Experimental & Theoretical Artificial Intelligence, 19(1), 43–54.

  • Sacks, O. (1984). A leg to stand on. New York: Harper & Row.

  • Schwoebel, J., Friedman, R., Duda, N., & Coslett, H. B. (2001). Pain and the body schema: Evidence for peripheral effects on mental representations of movement. Brain, 124(10), 2098–2104.

  • Searle, J. (1980). Minds, brains and programs. The Behavioral and Brain Sciences, 3(3), 417–457.

  • Seitz, J. A. (2000). The bodily basis of thought. New Ideas in Psychology, 18(1), 23–40.

  • Sheets-Johnstone, M. (1998). Consciousness: A natural history. Journal of Consciousness Studies, 5(3), 260–294.

  • Sheets-Johnstone, M. (1999). The primacy of movement. Amsterdam: John Benjamins.

  • Sheets-Johnstone, M. (2000). Kinetic tactile-kinesthetic bodies: Ontogenetical foundations of apprenticeship learning. Human Studies, 23, 343–370.

  • Sheets-Johnstone, M. (2003). Kinesthetic memory. Theoria et Historia Scientiarum, 7, 69–92.

  • Sloman, A. (2004). Varieties of affect and learning in a complete human-like architecture. http://www.cs.bham.ac.uk/research/cogaff/talks/#talk24. Retrieved July 2004.

  • Sloman, A. (2005). What are information-processing machines? What are information-processing virtual machines? http://www.cs.bham.ac.uk/~axs/misc/talks/information.pdf. Retrieved January 2005.

  • Stern, D. (1985). The interpersonal world of the infant: A view from psychoanalysis and developmental psychology. New York: Basic Books.

  • Sternberg, E. J. (2007). Are you a machine? The brain, the mind, and what it means to be human. Amherst: Prometheus.

  • Stuart, S. (2007). Machine consciousness: Cognitive and kinaesthetic imagination. Journal of Consciousness Studies, 14(7), 141–153.

  • Torrance, S. (2008). Ethics and consciousness in artificial agents. AI & Society, 22, 495–521.

  • Varela, F., Thompson, E., & Rosch, E. (2003). The embodied mind: Cognitive science and human experience. Cambridge: MIT Press.

  • Whitehead, A. N. (1929). Process and reality. Cambridge: Cambridge University Press.

  • Ziemke, T. (2003). What’s that thing called embodiment? In Proceedings of the 25th annual meeting of the Cognitive Science Society. Hove: Lawrence Erlbaum.

  • Ziemke, T. (2007a). What’s life got to do with it? In A. Chella & R. Manzotti (Eds.), Artificial consciousness (pp. 48–66). Exeter: Imprint Academic.

  • Ziemke, T. (2007b). The embodied self: Theories, hunches and robot models. Journal of Consciousness Studies, 14(7), 167–179.


Author information


Correspondence to Susan A. J. Stuart.


Cite this article

Stuart, S.A.J. Conscious machines: Memory, melody and muscular imagination. Phenom Cogn Sci 9, 37–51 (2010). https://doi.org/10.1007/s11097-009-9134-6
