Neuroethics, Cognitive Technologies and the Extended Mind Perspective

  • Original Paper
  • Published in: Neuroethics

Abstract

Current debates in neuroethics engage with extremely diverse technologies, and for some of them it is a point of contention whether they should be a topic for neuroethics at all. In this article, I evaluate the extended mind theory’s claim that it can define the scope of neuroethics’ domain, as well as determine the extension of an individual’s mind, via its so-called trust and glue criteria. I argue that a) extending the domain of neuroethics by this manoeuvre endangers the theoretical consistency of neuroethics, and b) the trust and glue criteria, in their current state, can introduce a bias towards overstating the relevance of representational devices in characterising the field of neuroethics as compared to other technologies which are clearly of neuroethical relevance. As a remedy, I suggest a modification of the trust and glue criteria and a broader conception of cognition.


Notes

  1. A fundamental critique can be found in the writings of Adams and Aizawa [7] and Rupert [8]. Their arguments have been criticized or rejected by several authors in the situated cognition tradition. For an overview see the contributions in [9].

  2. Functional equivalence and functional complementarity, however, are context-dependent. Functionally equivalent tools such as the spell-checker can become functionally complementary depending on the use to which they are put. The more users come to rely on functionally equivalent tools, the more those tools become functionally complementary, i.e. they are assigned more and more of the tasks in which they complement the user’s biological abilities. In extremis, this development will even affect the original functional equivalence. Once users completely rely on the tools to carry out the task in question, they will stop performing it themselves and thus stop training their own unassisted ability. Without such training it may well happen that the process in question is no longer realised by the user’s brain and cannot be performed by it without renewed training. Thus, the user’s brain and the external tool are no longer functionally equivalent, but fully functionally complementary (Bainbridge [16] calls this an irony of automation).

  3. On the cognitive role of gestures see [17].

  4. Wilson and Clark discuss reliability and durability and in passing mention that external resources are “fluently tuned and integrated” [14], hinting at the further dimension later called individualisation or entrenchment. Sutton et al. discuss the dimensions of durability, reliability, individualisation or entrenchment, and transparency, as well as Sterelny’s distinction between individual and collective resources [15]. Sterelny discusses the three dimensions of trust, individualisation, and individual versus collective use [20]. Menary [21] discusses the cognitive agent’s integration into the cognitive environment under the heading of physical manipulation and transformation.

  5. This topic was treated in film as early as 1990, when the gadget in question was not a smartphone but a Filofax (in the movie Filofax by Arthur Hiller).

  6. Levy refers to other situated cognition approaches beyond the extended mind theory in arguing for his ethical parity principle, especially the embodied cognition hypothesis by Damásio [10]. He does accept that the extended mind sceptic has common intuition on his side. Consequently, he is – for the sake of the argument – willing to settle for the weaker version of the ethical parity principle, which is supported by embodied cognition theories, for the purpose of introducing neuroethics. His own theoretical framework is, however, tied to the stronger position of the extended mind theory and thus to EPP (strong). For a detailed discussion of Levy’s EPP see [22], for a criticism of its use in neuroethics see [23].

  7. Anderson refers to different situated cognition approaches in support of his parity argument, i.e. to a person’s cognitive processes as embodied (p. 262, 265), embedded (p. 264) or extended (p. 264 f.).

  8. Smart et al. [25] discuss the internet as a cognitive ecology and refer to smartphones primarily as establishing mobile access to internet services. Instead of considering smartphones as part of the extended mind, they take a detour via embodied cognition theory: they refer to a number of studies supporting the claim that people regard their phones as extensions of their own body. As an extension of an individual’s body, these devices are to be considered part of the individual’s embodied mind, not an extended mind.

    As Smart et al. recognize, several prominent authors in the debate quite explicitly exclude the internet as a possible extension of individual minds. Against this verdict they argue that internet services made accessible by mobile and other devices rank high on the dimensions of Heersmink’s account. They stress a high ranking on dimensions 1 (information flow), 4 (trust), 5 (procedural transparency), 6 (informational transparency), to which both “accessibility of information” and “the ease with which the information can be interpreted” pertain, as well as 7 (individualization). They omit dimensions 2 (reliability) and 3 (durability), which are the focus of the present discussion.

  9. Sounds can obviously become representations if interpreted within a representational system. I merely want to distinguish them from signals already subsumed under a specific representational system. Unlike the information stored in a representational device, the information provided by sensory support devices is not yet part of a representational system, much as the information arriving on a human’s retina is not yet a representation but becomes one through acquired neural processes of visual processing.

  10. In [23] I argued that some of these dimensions are not easily applied to some parts of our cognitive environment, especially modern neurotechnologies such as transcranial stimulation devices. Rather, they seem to be custom-made for modern digital lifestyle appliances, i.e. wearables, smartphones and tablets.

  11. Actually, the thought is much older. As Hatfield points out: “Descartes of course affirms that the brain is involved in memory. But so is the body: ‘I think that all the nerves and muscles can be so utilized [for memory], so that a lute player, for instance, has a part of his memory in his hands’ (Descartes, 1991, p. 146).” [30]

  12. A near miss would be [35], which refers to situated cognition approaches in its discussion of Brain-Computer Interfaces for locked-in patients. The theory its authors could really make use of is enactivism (see [36]), and even that plays little role in their ethical argument.

  13. Admittedly, the last set of examples does contain some artefacts which do not store representational information in the same way as maps and notebooks; rather, they include social interaction and ecological artefacts. The social interaction described by Heersmink is, however, thoroughly representational: his examples are verbal cue-givings in conversations and their linguistic content. In earlier writing he does distinguish between ecological and representational cognitive artefacts, and is thus aware that the cognitive environment consists of more than our digital lifestyle appliances. Ecological artefacts are objects to which we assign representational character by virtue of their position or arrangement, e.g. the DVD on my front desk, put there to remind me to return it to the library. Thus, the issue remains: information flow (as well as trust and informational transparency) is a criterion designed for tools and props containing representational information, be it by their design or by their arrangement [2].

  14. Such actions are called ‘epistemic actions’ [37]. The authors introduced the term to distinguish actions which are directed at a perceptual or cognitive advantage from actions directed at non-epistemically relevant changes in the world. Clark and Chalmers as well as Heersmink adopt this terminology.

  15. A further reason speaks for dropping this dimension altogether: it has never been obvious why a device which provides more information should be more closely integrated with a human mind than one which provides only a little. The amount of information required from a device in support of a cognitive task depends on the task in question. Neither more information nor additional information channels and directions will integrate a device better; only a better fit with the task’s requirements would.

    Admittedly, the dimension of information flow admits of two important forms of variation: kind and quantity. The kind of information flow (one-way, two-way or reciprocal) is relevant. Two things, however, make the amount of information exchanged relevant as well: 1) information flow is meant to be a dimension and, qua dimension, evaluable at least in an ordinal sense; 2) Heersmink explicitly discusses bandwidth, i.e. the amount of information exchanged in a given time: “Finally, like information flow in computer networks, information flow in situated cognitive systems has a certain bandwidth, which is the amount of information that is interpreted or offloaded per unit of time. So, the more information is interpreted or offloaded in a given amount of time, the higher the bandwidth, which is often an important aspect for realizing a cognitive purpose.” [2]

References

  1. Anderson, Joel. 2008. Neuro-Prosthetics, the Extended Mind, and Respect for Persons with Disability. In The Contingent Nature of Life: Bioethics and Limits of Human Existence, ed. Marcus Düwell, Christoph Rehmann-Sutter, and Dietmar Mieth, 259–274. Dordrecht: Springer Netherlands.

  2. Heersmink, Richard. 2015. Dimensions of Integration in Embedded and Extended Cognitive Systems. Phenomenology and the Cognitive Sciences 14 (3): 577–598. https://doi.org/10.1007/s11097-014-9355-1.

  3. Clark, Andy. 1997. Being There. Putting Brain, Body, and World Together Again. Cambridge: MIT Press.

  4. Heersmink, Richard. 2017. Extended Mind and Cognitive Enhancement: Moral Aspects of Cognitive Artifacts. Phenomenology and the Cognitive Sciences 16 (1): 17–32. https://doi.org/10.1007/s11097-015-9448-5.

  5. Levy, Neil. 2007. Neuroethics. Cambridge; New York: Cambridge University Press.

  6. Heersmink, Richard. 2017. Distributed Selves: Personal Identity and Extended Memory Systems. Synthese 194 (8): 3135–3151. https://doi.org/10.1007/s11229-016-1102-4.

  7. Adams, Frederick, and Kenneth Aizawa. 2008. The Bounds of Cognition. Malden: Blackwell.

  8. Rupert, Robert D. 2004. Challenges to the Hypothesis of Extended Cognition. Journal of Philosophy 101 (8): 389–428. https://doi.org/10.5840/jphil2004101826.

  9. Menary, Richard. 2010. The Extended Mind. Cambridge: MIT Press.

  10. Damásio, António R. 1994. Descartes' Error: Emotion, Reason, and the Human Brain. New York: Avon Books.

  11. Hurley, Susan L. 1998. Consciousness in Action. Cambridge: Harvard University Press.

  12. Clark, Andy, and David Chalmers. 1998. The Extended Mind. Analysis 58 (1): 7–19. https://doi.org/10.1111/1467-8284.00096.

  13. Sutton, John. 2010. Exograms and Interdisciplinarity: History, the Extended Mind, and the Civilizing Process. In The Extended Mind, ed. Richard Menary, 189–226. Cambridge: MIT Press.

  14. Wilson, Robert A., and Andy Clark. 2009. How to Situate Cognition: Letting Nature Take its Course. In The Cambridge Handbook of Situated Cognition, ed. Murat Aydede and Philip Robbins, 55–77. Cambridge: Cambridge University Press.

  15. Sutton, John, Celia B. Harris, Paul G. Keil, and Amanda J. Barnier. 2010. The Psychology of Memory, Extended Cognition, and Socially Distributed Remembering. Phenomenology and the Cognitive Sciences 9 (4): 521–560. https://doi.org/10.1007/s11097-010-9182-y.

  16. Bainbridge, Lisanne. 1983. Ironies of Automation. Automatica 19 (6): 775–779. https://doi.org/10.1016/0005-1098(83)90046-8.

  17. Alibali, Martha W., Rebecca Boncoddo, and Autumn B. Hostetter. 2014. Gesture in Reasoning. In The Routledge Handbook of Embodied Cognition, ed. Lawrence Shapiro, 150-159. London: Routledge.

  18. Yu, Chen, Linda B. Smith, Hongwei Shen, Alfredo F. Pereira, and Thomas Smith. 2009. Active Information Selection: Visual Attention Through the Hands. IEEE Transactions on Autonomous Mental Development 1 (2): 141–151. https://doi.org/10.1109/TAMD.2009.2031513.

  19. Klaming, Laura, and Pim Haselager. 2013. Did My Brain Implant Make Me Do It? Questions Raised by DBS Regarding Psychological Continuity, Responsibility for Action and Mental Competence. Neuroethics 6: 527–539. https://doi.org/10.1007/s12152-010-9093-1.

  20. Sterelny, Kim. 2010. Minds: Extended or Scaffolded? Phenomenology and the Cognitive Sciences 9 (4): 465–481. https://doi.org/10.1007/s11097-010-9174-y.

  21. Menary, Richard. 2010. Cognitive integration and the extended mind. In The Extended Mind, ed. Richard Menary, 227–243. Cambridge: MIT Press.

  22. DeMarco, Joseph P., and Paul J. Ford. 2014. Neuroethics and the Ethical Parity Principle. Neuroethics 7 (3): 317–325. https://doi.org/10.1007/s12152-014-9211-6.

  23. Heinrichs, Jan-Hendrik. 2017. Against Strong Ethical Parity: Situated Cognition Theses and Transcranial Brain Stimulation. Frontiers in Human Neuroscience 11 (171): 171. https://doi.org/10.3389/fnhum.2017.00171.

  24. Noë, Alva. 2004. Action in Perception. Cambridge: MIT Press.

  25. Smart, Paul, Richard Heersmink, and Robert W. Clowes. 2017. The Cognitive Ecology of the Internet. In Cognition Beyond the Brain: Computation, Interactivity and Human Artifice, ed. Stephen J. Cowley and Frédéric Vallée-Tourangeau, 251–282. Dordrecht: Springer.

  26. Montgomery, Erwin B., Jr., and John T. Gale. 2008. Mechanisms of Action of Deep Brain Stimulation (DBS). Neuroscience & Biobehavioral Reviews 32 (3): 388–407. https://doi.org/10.1016/j.neubiorev.2007.06.003.

  27. Chan, Danny T.M., Xian Lun Zhu, Jonas H.M. Yeung, Vincent C.T. Mok, Edith Wong, Clara Lau, Rosanna Wong, Christine Lau, and Wai S. Poon. 2009. Complications of Deep Brain Stimulation: a Collective Review. Asian Journal of Surgery 32 (4): 258–263. https://doi.org/10.1016/S1015-9584(09)60404-8.

  28. Benabid, Alim-Louis, P. Pollak, A. Louveau, S. Henry, and J. de Rougemont. 1987. Combined (Thalamotomy and Stimulation) Stereotactic Surgery of the VIM Thalamic Nucleus for Bilateral Parkinson Disease. Applied Neurophysiology 50 (1–6): 344–346. https://doi.org/10.1159/000100803.

  29. de Haan, Sanneke, Erik Rietveld, Martin Stokhof, and Damiaan Denys. 2017. Becoming more oneself? Changes in Personality following DBS Treatment for Psychiatric Disorders: Experiences of OCD Patients and General Considerations. PLoS One 12 (4): e0175748. https://doi.org/10.1371/journal.pone.0175748.

  30. Hatfield, Gary. 2014. Cognition. In The Routledge Handbook of Embodied Cognition, ed. Lawrence Shapiro, 361–373. London: Routledge.

  31. Varela, Francisco J., Evan Thompson, and Eleanor Rosch. 1991. The Embodied Mind. Cognitive Science and Human Experience. Cambridge: MIT Press.

  32. Prinz, Wolfgang. 1997. Perception and Action Planning. European Journal of Cognitive Psychology 9 (2): 129–154. https://doi.org/10.1080/713752551.

  33. Hohwy, Jakob. 2013. The Predictive Mind. 1st ed. Oxford; New York: Oxford University Press.

  34. Shapiro, Lawrence A. 2011. Embodied Cognition. New York: Routledge.

  35. Fenton, Andrew, and Sheri Alpert. 2008. Extending Our View on Using BCIs for Locked-in Syndrome. Neuroethics 1 (2): 119–132. https://doi.org/10.1007/s12152-008-9014-8.

  36. Walter, Sven. 2010. Locked-in Syndrome, BCI, and a Confusion about Embodied, Embedded, Extended, and Enacted Cognition. Neuroethics 3 (1): 61–72. https://doi.org/10.1007/s12152-009-9050-z.

  37. Kirsh, David, and Paul Maglio. 1994. On Distinguishing Epistemic from Pragmatic Action. Cognitive Science 18 (4): 513–549. https://doi.org/10.1016/0364-0213(94)90007-8.

Author information

Corresponding author

Correspondence to Jan-Hendrik Heinrichs.

About this article

Cite this article

Heinrichs, JH. Neuroethics, Cognitive Technologies and the Extended Mind Perspective. Neuroethics 14, 59–72 (2021). https://doi.org/10.1007/s12152-018-9365-8
