Abstract
Synthetic psychology describes the approach of “understanding through building” applied to the human condition. In this chapter, we consider the specific challenge of synthesizing a robot “sense of self”. Our starting hypothesis is that the human self is brought into being by the activity of a set of transient self-processes instantiated by the brain and body. We propose that we can synthesize a robot self by developing equivalent sub-systems within an integrated biomimetic cognitive architecture for a humanoid robot. We begin the chapter by motivating this work in the context of the criteria for recognizing other minds and the challenge of benchmarking artificial intelligence against human intelligence, and conclude by describing efforts to create a sense of self for the iCub humanoid robot that has ecological, temporally-extended, interpersonal, and narrative components set within a multi-layered model of mind.
Notes
1. By this, we mean the cluster of different but overlapping intellectual/cognitive faculties that make humans adaptive, flexible sociotechnical animals. Gardner’s [22] “multiple intelligences” view provides a good guide to this broader notion of human cognition. Attempts to create machine intelligence of this more multi-faceted form are increasingly discussed under the label Artificial General Intelligence (AGI) (e.g., [23]), hence we are using the phrase “general intelligence” rather than Gardner’s multiple intelligences.
2. Nathan Bateman to Caleb Smith about the humanoid robot “Ava” he has created, from the original movie script for Ex Machina (2015) by Alex Garland.
3. The suggestion that we call this the Garland test has also been made by Murray Shanahan, one of the scientific advisors on Ex Machina.
4. It has been suggested that Harnad’s T2 level cannot be achieved without first building T3 to achieve symbol-grounding [26]. Going directly to T2 is nevertheless a theoretical possibility, even if it might prove impossible to achieve without a contribution from robotics.
5. This idea also follows in the footsteps of many others: for example, the eighteenth-century Neapolitan philosopher Giambattista Vico, who wrote “verum et factum reciprocantur seu convertuntur [the true is precisely what is made]”, and the twentieth-century physicist Richard Feynman, whose office blackboard on the day he died held the message, “what I cannot create I do not understand”.
6. There are multiple measures of so-called “correlates of consciousness”, Tononi’s Φ [65], a measure of information integration, being one of the better-known ones. The problem is that there is no way to be sure that an organism or machine that scores highly on any such measure is actually experiencing consciousness. This is known as the “other minds” problem in philosophy. For Turing [68], this was part of the reason to devise a behavioral test for the existence of machine thought and to leave the challenge of consciousness to others.
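Tononi’s Φ itself is computed over all bipartitions of a system and is expensive to evaluate, but the underlying intuition of information integration can be loosely illustrated with a much simpler quantity. The sketch below (our own toy example, not Φ, and the function name is ours) computes the mutual information between two binary units: a pair that always agree is maximally “integrated” (1 bit), while two independent units share no information (0 bits).

```python
import numpy as np

def mutual_information(joint):
    """Mutual information (in bits) of a 2-D joint probability table."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of unit X
    py = joint.sum(axis=0, keepdims=True)   # marginal of unit Y
    nz = joint > 0                          # avoid log(0) terms
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# Two binary units that always agree: 1 bit of shared information.
correlated = [[0.5, 0.0], [0.0, 0.5]]
# Two independent fair coins: 0 bits of shared information.
independent = [[0.25, 0.25], [0.25, 0.25]]
```

Even a perfect score on a measure like this (or on Φ proper) shows only that a system’s parts are informationally coupled; as the note above observes, it cannot settle whether the system is actually experiencing anything.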
7. Abel’s “field of being” view stems from Merleau-Ponty’s [38] phenomenology and his insistence on the centrality of the experience of the body. Studies in cognitive neuroscience, such as those of the “rubber hand” illusion (see [10]), support Merleau-Ponty’s proposal that the sense of the body/self can extend into objects and the world. With virtual reality systems and telepresence robots, it is now possible to experimentally manipulate the sense of a virtual body, or of a physically remote robot body, and the associated feelings of immersion or “presence”, demonstrating that “my body is wherever there is something to be done” (Merleau-Ponty, [38] p. 291) and providing new ways to test hypotheses about the self.
8. This was proposed by Hume [30], for whom, if the stream of perceptions is turned off, as happens in sleep, the self ceases to exist, and by Locke [35], for whom self was a manifestation of consciousness, which, in turn, requires an awake mind. Some elements of Locke’s view of self, which saw identity as arising from learning and memory, are close to the ideas of the extended and narrative selves discussed in this chapter.
9. We should admit here that Strawson intends the more restricted philosophical sense of phenomenology as a form of systematic reflection on the structure of experience. We prefer to interpret the challenge more empirically, as one of describing phenomena associated with self that could be accessible to the methods of psychology and cognitive neuroscience.
10. Note that, for a theory or concept of self to be useful, we would not consider that the self has to be emergent in a strong sense (that is, not reducible to lower-level phenomena), but rather it has to serve a useful explanatory function in our psychological theory. In other words, the concept of self as explicated and realized in machine form should help us to provide useful accounts of human (or machine) cognition and behavior. See Verschure and Prescott [72] for a discussion of theory building and the role of synthetic approaches in the sciences of mind and brain.
11. Modularity is itself a topic that is widely debated within the cognitive sciences. Again, we consider that the synthetic approach can help answer some of the longstanding questions about how distributed vs. modular human minds/brains are. Our view is that the distributed nature of the brain can be over-stated. The brain is a layered architecture [49], and as such there is significant replication of function and some redundancy across these layers; however, there is also localization of function, and specific local or repeated circuits perform roles that can be clearly described and differentiated.
12. Endel Tulving’s patient N.N. exemplifies this point [67]. A traffic accident caused N.N. to experience profound retrograde and anterograde amnesia; nevertheless, he could still talk about himself, his experience, his preferences, and so on; he had intact short-term memory and could describe time and events in general terms. He could talk about consciousness, which he described as “being aware of who we are and what we are and where we are” ([67], p. 4). When asked to imagine what he might do tomorrow, however, his mind drew a blank, which he described as being “like swimming in the middle of a lake. There’s nothing there to hold you up or do anything with” ([67], p. 4). Like other patients with amnesia, N.N. could be described as “marooned in the present” [34] or as having a self that has lost much of its “temporal thickness” [20].
References
Abel, C. (2014). The extended self: Architecture, memes and minds. Manchester: Manchester University Press.
Amsterdam, B. (1972). Mirror self-image reactions before age two. Developmental Psychobiology, 5(4), 297–305.
Ardiel, E. L., & Rankin, C. H. (2010). An elegant mind: Learning and memory in Caenorhabditis elegans. Learning & Memory, 17(4), 191–201. https://doi.org/10.1101/lm.960510.
Bard, K. A., Todd, B. K., Bernier, C., Love, J., & Leavens, D. A. (2006). Self-awareness in human and chimpanzee infants: What is measured and what is meant by the mark and mirror test? Infancy, 9(2), 191–219. https://doi.org/10.1207/s15327078in0902_6.
Baron-Cohen, S., Leslie, A. M., & Frith, U. (1985). Does the autistic child have a ‘theory of mind’? Cognition, 21, 37–48.
Bauer, P. J. (2012). The life I once remembered: The waxing and waning of early memories. In D. Berntsen & D. C. Rubin (Eds.), Understanding autobiographical memory (pp. 205–225). Cambridge: CUP.
Bell, M. A., & Deater-Deckard, K. (2007). Biological systems and the development of self-regulation: Integrating behavior, genetics, and psychophysiology. Journal of Developmental & Behavioral Pediatrics, 28(5).
Bermúdez, J. (1998). The paradox of self-consciousness. Cambridge, MA: MIT Press.
Blackmore, S. (2003). Consciousness in meme machines. Journal of Consciousness Studies, 10(4–5), 19–30.
Blakeslee, S., & Blakeslee, M. (2007). The body has a mind of its own. New York: Random House.
Braitenberg, V. (1986). Vehicles: Experiments in synthetic psychology. Cambridge, MA: MIT Press.
Camilleri, D., & Prescott, T. J. (2017). Action recognition with unsynchronised multi-sensory data. Paper presented at the 7th Joint IEEE International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EPIROB), Lisbon, Portugal.
Camilleri, D., Damianou, A., Jackson, H., Lawrence, N., & Prescott, T. J. (2016). iCub visual memory inspector: Visualising the iCub’s thoughts. In N. F. Lepora, A. Mura, M. Mangan, P. F. M. J. Verschure, M. Desmulliez, & T. J. Prescott (Eds.), Biomimetic and Biohybrid Systems, the 5th International Conference on Living Machines (pp. 48–57). Berlin: Springer LNAI.
Cangelosi, A., Schlesinger, M., & Smith, L. B. (2015). Developmental robotics: From babies to robots. Cambridge, MA: MIT Press.
Damianou, A., Henrik, C., Boorman, L., Lawrence, N. D., & Prescott, T. J. (2015). A top-down approach for a synthetic autobiographical memory system. In S. Wilson, T. J. Prescott, A. Mura, & P. F. M. J. Verschure (Eds.), Biomimetic and Biohybrid Systems, the 4th International Conference on Living Machines (Vol. 9222, pp. 280–292). Berlin: Springer LNAI.
Demiris, Y., Aziz-Zadeh, L., & Bonaiuto, J. (2014). Information processing in the mirror neuron system in primates and machines. Neuroinformatics, 12(1), 63–91.
Doherty, M. (2009). Theory of mind: How children understand others’ thoughts and feelings. Hove: Psychology Press.
Donald, M. (2012). Evolutionary origins of autobiographical memory: A retrieval hypothesis. In D. Berntsen & D. C. Rubin (Eds.), Understanding autobiographical memory (pp. 269–289). Cambridge: CUP.
Evans, M. H., Fox, C. W., & Prescott, T. J. (2014). Machines learning—towards a new synthetic autobiographical memory. In A. Duff, N. Lepora, A. Mura, T. Prescott, & P. M. J. Verschure (Eds.), Biomimetic and Biohybrid Systems, the 3rd International Conference on Living Machines (Vol. 8608, pp. 84–96). Berlin: Springer LNAI.
Friston, K. (2017). The mathematics of mind-time. Aeon.
Gallagher, S. (2000). Philosophical conceptions of the self: Implications for cognitive science. Trends in Cognitive Sciences, 4(1), 14–21.
Gardner, H. (2006). Multiple intelligences: New horizons. New York: Basic Books.
Goertzel, B., & Pennachin, C. (2007). Artificial general intelligence. New York: Springer.
Harnad, S. (1991). Other bodies, other minds: A machine incarnation of an old philosophical problem. Minds and Machines, 1, 43–54.
Harnad, S. (1994). Does the mind piggy-back on robotic and symbolic capacity? In H. L. Morowitz & J. L. Singer (Eds.), The mind, the brain, and complex adaptive systems, Santa Fe Institute Studies in Complexity XXII (pp. 204–220). Boston: Addison Wesley.
Hauser, L. (1993). Reaping the whirlwind: Reply to Harnad’s “Other bodies, other minds”. Minds and Machines, 3, 219–238.
Hoffmann, M., Straka, Z., Farkas, I., Vavrecka, M., & Metta, G. (2017). Robotic homunculus: Learning of artificial skin representation in a humanoid robot motivated by primary somatosensory cortex. IEEE Transactions on Cognitive and Developmental Systems, PP(99), 1–1. https://doi.org/10.1109/tcds.2017.2649225.
Hofstadter, D. (2007). I am a strange loop. New York: Basic Books.
Hood, B. (2012). The Self illusion: Why there is no ‘you’ inside your head. London: Constable and Robinson.
Hume, D. (1740). A treatise of human nature.
Humphries, M. D., & Prescott, T. J. (2010). The ventral basal ganglia, a selection mechanism at the crossroads of space, strategy, and reward. Progress in Neurobiology, 90(4), 385–417. https://doi.org/10.1016/j.pneurobio.2009.11.003.
Jeannerod, M. (2003). The mechanism of self-recognition in humans. Behavioural Brain Research, 142(1), 1–15. https://doi.org/10.1016/S0166-4328(02)00384-4.
Lambert, F. R., Lavenex, P., & Lavenex, P. B. (2017). The “when” and the “where” of single-trial allocentric spatial memory performance in young children: Insights into the development of episodic memory. Developmental Psychobiology, 59(2), 185–196. https://doi.org/10.1002/dev.21479.
Lidz, T. (1942). The amnesic syndrome. Archives of Neurology and Psychiatry, 47, 588–605.
Locke, J. (1777). An essay concerning human understanding.
Lungarella, M., Metta, G., Pfeifer, R., & Sandini, G. (2003). Developmental robotics: A survey. Connection Science, 15(4), 151–190. https://doi.org/10.1080/09540090310001655110.
Martinez-Hernandez, U., Damianou, A., Camilleri, D., Boorman, L. W., Lawrence, N., & Prescott, T. J. (2016). An integrated probabilistic framework for robot perception, learning and memory. Paper presented at the 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO), Qingdao, China. pp. 1796–1801.
Merleau-Ponty, M. (1945/1962). Phénoménologie de la perception [Phenomenology of perception] (C. Smith, Trans.). London: Routledge.
Metzinger, T. (2009). The ego tunnel: The science of the mind and the myth of the self. New York: Basic Books.
Mitchinson, B., Pearson, M., Pipe, T., & Prescott, T. J. (2011). Biomimetic robots as scientific models: A view from the whisker tip. In J. Krichmar & H. Wagatsuma (Eds.), Neuromorphic and brain-based robots (pp. 23–57). Boston, MA: MIT Press.
Moulin-Frier, C., Fischer, T., Petit, M., Pointeau, G., Puigbo, J. Y., & Pattacini, U., et al. (2017). DAC-h3: A proactive robot cognitive architecture to acquire and express knowledge about the world and the self. IEEE Transactions on Cognitive and Developmental Systems, PP(99), 1–1. https://doi.org/10.1109/tcds.2017.2754143.
Neisser, U. (1988). Five kinds of self-knowledge. Philosophical Psychology, 1, 35–59. https://doi.org/10.1080/09515088808572924.
Neisser, U. (1995). Criteria for an ecological self. In P. Rochat (Ed.), The Self in infancy: Theory and research. Amsterdam: Elsevier.
Nelson, K. (2007). Young minds in social worlds: Experience, meaning and memory. Cambridge, MA: Harvard University Press.
Panksepp, J. (1998). Affective neuroscience: The foundations of human and animal emotions. Oxford: OUP.
Pointeau, G., & Dominey, P. F. (2017). The role of autobiographical memory in the development of a robot self. Frontiers in Neurorobotics, 11, 27.
Prescott, T. J. (2007). Forced moves or good tricks in design space? Landmarks in the evolution of neural mechanisms for action selection. Adaptive Behavior, 15(1), 9–31.
Prescott, T. J. (2015). Me in the machine. New Scientist, 36–39.
Prescott, T. J., Redgrave, P., & Gurney, K. N. (1999). Layered control architectures in robots and vertebrates. Adaptive Behavior, 7(1), 99–127.
Prescott, T. J., Mitchinson, B., Lepora, N. F., Wilson, S. P., Anderson, S. R., Porrill, J., et al. (2015). The robot vibrissal system: Understanding mammalian sensorimotor co-ordination through biomimetics. In P. Krieger & A. Groh (Eds.), Sensorimotor integration in the whisker system (pp. 213–240). New York: Springer.
Prescott, T. J., Ayers, J., Grasso, F. W., & Verschure, P. F. M. J. (2016). Embodied models and neurorobotics. In M. A. Arbib & J. J. Bonaiuto (Eds.), From neuron to cognition via computational neuroscience (pp. 483–512). Cambridge, MA: MIT Press.
Prescott, T. J., Lepora, N., & Verschure, P. F. M. J. (2018). The handbook of living machines: Research in biomimetic and biohybrid systems. Oxford, UK: OUP.
Prior, H., Schwarz, A., & Gunturkun, O. (2008). Mirror-induced behavior in the magpie (Pica pica): Evidence of self-recognition. PLoS Biology, 6(8), e202.
Rochat, P. (2001). The infant’s world. Cambridge, MA: Harvard University Press.
Roncone, A., Hoffmann, M., Pattacini, U., & Metta, G. (2014). Automatic kinematic chain calibration using artificial skin: Self-touch in the iCub humanoid robot. Paper presented at the 2014 IEEE International Conference on Robotics and Automation (ICRA), pp. 2305–2312.
Roncone, A., Hoffmann, M., Pattacini, U., Fadiga, L., & Metta, G. (2016). Peripersonal space and margin of safety around the body: Learning visuo-tactile associations in a humanoid robot with artificial skin. PLoS ONE, 11(10), e0163713. https://doi.org/10.1371/journal.pone.0163713.
Rubin, D. C. (2006). The basic-systems model of episodic memory. Perspectives on Psychological Science, 1(4), 277–311.
Schacter, D. L., Addis, D. R., Hassabis, D., Martin, V. C., Spreng, R. N., & Szpunar, K. K. (2012). The future of memory: Remembering, imagining, and the brain. Neuron, 76(4). https://doi.org/10.1016/j.neuron.2012.11.001.
Searle, J. (1990). Is the brain’s mind a computer program? Scientific American, 262(1), 20–25.
Silberman, E. K., Putnam, F. W., Weingartner, H., Braun, B. G., & Post, R. M. (1985). Dissociative states in multiple personality disorder: A quantitative study. Psychiatry Research, 15(4), 253–260. https://doi.org/10.1016/0165-1781(85)90062-9.
Silver, D., Hubert, T., Schrittwieser, J., Antonoglou, I., Lai, M., & Guez, A., et al. (2017). Mastering chess and shogi by self-play with a general reinforcement learning algorithm. arXiv:1712.01815v1 [cs.AI] 5 Dec 2017.
Strawson, G. (1997). The self. Journal of Consciousness Studies, 4(5/6), 405–428.
Suddendorf, T., & Corballis, M. C. (2007). The evolution of foresight: What is mental time travel, and is it unique to humans? Behavioral and Brain Sciences, 30(3), 299–313. https://doi.org/10.1017/S0140525X07001975.
Tani, J. (1998). An interpretation of the ‘self’ from the dynamical systems perspective: A constructivist approach. Journal of Consciousness Studies, 5, 516–542.
Tononi, G. (2004). An information integration theory of consciousness. BMC Neuroscience, 5(1), 42. https://doi.org/10.1186/1471-2202-5-42.
Towner, S. (2010). Concept of mind in non-human primates. Bioscience Horizons: The International Journal of Student Research, 3(1), 96–104. https://doi.org/10.1093/biohorizons/hzq011.
Tulving, E. (1985). Memory and consciousness. Canadian Psychology, 26(1), 1–12.
Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59(236), 433–460.
Uddin, L. Q. (2011). The self in autism: An emerging view from neuroimaging. Neurocase, 17(3), 201–208. https://doi.org/10.1080/13554794.2010.509320.
Vallar, G. (1998). Spatial hemineglect in humans. Trends in Cognitive Sciences, 2(3), 87–97. https://doi.org/10.1016/S1364-6613(98)01145-0.
Verschure, P. F. M. J. (2012). Distributed adaptive control: A theory of the mind, brain, body nexus. Biologically Inspired Cognitive Architectures, 1, 55–72. https://doi.org/10.1016/j.bica.2012.04.005.
Verschure, P. F. M. J., & Prescott, T. J. (2018). A living machines approach to the sciences of mind and brain. In T. J. Prescott, N. Lepora, & P. F. M. J. Verschure (Eds.), The handbook of living machines: Research in biomimetic and biohybrid systems. Oxford, UK: OUP.
Verschure, P. F. M. J., Krose, B., & Pfeifer, R. (1992). Distributed adaptive control: The self-organization of structured behavior. Robotics and Autonomous Systems, 9, 181–196.
Verschure, P. F. M. J., Pennartz, C. M. A., & Pezzulo, G. (2014). The why, what, where, when and how of goal-directed choice: Neuronal and computational principles. Philosophical Transactions of the Royal Society of London B: Biological Sciences, 369(1655). https://doi.org/10.1098/rstb.2013.0483.
Zelazo, P. D. (2004). The development of conscious control in childhood. Trends in Cognitive Sciences, 8(1), 12–17. https://doi.org/10.1016/j.tics.2003.11.001.
Acknowledgements
The preparation of this chapter was supported by funding from the EU Seventh Framework Programme as part of the projects Experimental Functional Android Assistant (EFAA, FP7-ICT-270490) and What You Say Is What You Did (WYSIWYD, FP7-ICT-612139) and the EU H2020 Programme as part of the Human Brain Project (HBP-SGA1, 720270). We are particularly grateful to Paul Verschure, Peter Dominey, Giorgio Metta, Yiannis Demiris and the other members of the WYSIWYD and EFAA consortia, and to our colleagues at the University of Sheffield who have helped us to develop memory systems for the iCub, particularly Uriel Martinez, Andreas Damianou, Neil Lawrence, Luke Boorman and Matthew Evans. The Sheffield iCub was purchased with the support of the UK Engineering and Physical Science Research Council (EPSRC).
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this chapter
Prescott, T.J., Camilleri, D. (2019). The Synthetic Psychology of the Self. In: Aldinhas Ferreira, M., Silva Sequeira, J., Ventura, R. (eds) Cognitive Architectures. Intelligent Systems, Control and Automation: Science and Engineering, vol 94. Springer, Cham. https://doi.org/10.1007/978-3-319-97550-4_7
Print ISBN: 978-3-319-97549-8
Online ISBN: 978-3-319-97550-4