Infantilisation through Technology

Technology, Anthropology, and Dimensions of Responsibility

Part of the book series: Techno:Phil – Aktuelle Herausforderungen der Technikphilosophie (TPAHT, volume 1)

Abstract

Persuasive technologies are designed with the aim of influencing people’s attitudes and behaviour in order to help them achieve their goals and realise their values more efficiently. It is commonly assumed that such technologies do not work in a malevolent, manipulative or coercive way, but rather function as self-administered nudges which promote rational, autonomous and perhaps even moral conduct and are therefore conducive to leading a good life. As beneficial as such assisting technologies may appear at first sight, they might involve a substantial drawback. Due to the ambivalent nature of technology and the dialectic entailed in technological progress in general, persuasive technologies, far from enhancing autonomy and the good life, might exert an influence on their users which I will term “infantilisation”. Infants and children generally lack the capacities to care for themselves and to reflect autonomously on appropriate courses of action; they therefore require assistance in developing values, in understanding and adopting social norms, and in acting according to these standards. Once these capacities have been developed, however, adults are usually regarded as able to assume responsibility for their own lives. In the present paper, I will argue that the use of persuasive technologies might threaten this (self-)ascription of responsibility by treating average autonomous persons like children in need of normative and practical guidance. I will assume that this holds on both individualist and relational accounts of autonomy and might impair the exercise of capacities for reflecting on and realising personal conceptions of the good life.

I’m a 21st century digital boy, I don’t know how to live, but I’ve got a lot of toys.

(Bad Religion, 1990)


Notes

  1.

    Interestingly, King and Tester (1999, p. 36) noted at the onset of the development of persuasive technologies that “[a]lthough one might expect to find many persuasive technologies geared toward self-help issues, like how to break bad habits, remarkably few existing technologies fall into this category.” Today, instilling certain habits appears to be among the paramount goals of persuasive technologies; the authors were thus also right in forecasting that such technologies would “become more common—perhaps even pervasive” (King and Tester 1999, p. 36). To this effect, Kersten-van Dijk et al. (2017, p. 270) assert that a kind of “self-improvement hypothesis” underlies the development and use of what they call “personal informatics” technologies.

  2.

    Cf. Spahn (2013, p. 113): “Arguably persuasion lies in between the ethically unproblematic, but often not effective influence method ‘convincing through arguments’ and the ethically prima facie problematic, but often effective methods of ‘manipulation’ and ‘coercion.’”

  3.

    This “mirror” resembles the picture of Dorian Gray, which reflected its model’s moral decline while he himself stayed young and beautiful. Similarly, the persuasive mirror mercilessly reveals the foreseeable consequences of “bad” health behaviour while the user’s self-image presumably stays relatively stable. Verbeek (2011, pp. 112, 123) also refers to this example.

  4.

    The authors obviously use the notions of happiness and (emotional) well-being interchangeably and are exclusively concerned with a psychological approach to subjective well-being; cf. Hollis et al. (2017, p. 215, original emphasis): “Although exact definitions of well-being are debated, there is scientific consensus that it involves two main components — hedonic, relating to real-time affect […], and eudaimonic, which concerns progress towards longer term life-goals and values[.]”.

  5.

    Gary Wolf, technology writer and co-founder of the “Quantified Self” movement, for example, takes it for granted that ubiquitous computing and self-tracking technologies like Fitbit “make us better humans in every aspect, i.e., thin, rich, happy and smart” (quoted in Bernard 2017, p. 116; my translation).

  6.

    Being content with leading a comparatively inactive, withdrawn and unexceptional—let alone unhealthy and unathletic—life no longer appears to be up to date, which is bad news for prototypical scholars, geeks, philosophers, particularly adherents of classical Epicureanism and scepticism, and other proponents of the time-honoured ideal of a contemplative life.

  7.

    As with all kinds of advertising, the first thought intended to be provoked in users by persuasive technologies is not “Do I need this?”, but rather “All the others have it, I need it too.”

  8.

    See also Verbeek (2011, p. 127): “Technologies’ so-called multistability makes it difficult to fully predict the ways they will influence human actions or to evaluate this influence in ethical terms.”

  9.

    I would like to thank Michael Kühler for pressing me on this point and for helpful comments which allowed me to express my concerns more precisely.

  10.

    Verbeek instead proposes applying the term “freedom”: “While the concept of autonomy stresses the importance of the absence of ‘external influences’ in order to keep the moral subject as pure as possible, the concept of freedom recognizes that the subject is formed in interaction with these influences” (Verbeek 2011, p. 85, original emphasis). However, the concept of freedom appears to be as complex and equivocal as the concept of autonomy, all the more so when relying on authors as terminologically demanding as Foucault and Heidegger (Verbeek 2011, Chap. 4). Moreover, the word “freedom” could easily be confused with a concept of “freedom of the will”, which is even more complicated and quite the opposite of what Verbeek appears to have in mind.

  11.

    Apart from empirical work on so-called bounded rationality (Kahneman 2011; Ariely 2010), it is probably still Gehlen’s 1940 conception of man as a deficient creature (“Mängelwesen”) that lurks behind such assumptions (Gehlen 2016). However, it is not self-evident why this conceptualisation, which often appears to go unsubstantiated, should be taken for granted.

References

  • Andrés del Valle, A. C., & Opalach, A. (n.d.). The Persuasive Mirror: Computerized persuasion for healthy living. https://pdfs.semanticscholar.org/21a9/184e68bd35c76a44eb17a34fb26925147a73.pdf. Accessed 21 Aug 2019.

  • Ariely, D. (2010). Predictably irrational. The hidden forces that shape our decisions (Revised and updated edition). London: Harper.

  • Beauchamp, T. L., & Childress, J. F. (2013). Principles of biomedical ethics (7th ed.). New York: Oxford University Press.

  • Beck, B. (2015). Kunden, die dieses neurale Aktivitätsmuster zeigten, kauften auch… Nudge durch Neuroökonomie? In J. S. Ach, B. Lüttenberg, & A. Nossek (Eds.), Neuroimaging und Neuroökonomie. Grundlagen, ethische Fragestellungen, soziale und rechtliche Relevanz (Münsteraner Bioethik-Studien Bd. 14, pp. 79–102). Berlin: LIT.

  • Bernard, A. (2017). Komplizen des Erkennungsdienstes. Das Selbst in der digitalen Kultur. Frankfurt a. M.: Fischer.

  • Comber, R., & Thieme, A. (2017). BinCam: Evaluating persuasion at multiple scales. In L. Little, E. Sillence, & A. Joinson (Eds.), Behavior change research and theory. Psychological and technological perspectives (pp. 181–194). London: Academic.

  • Christman, J. (2009). The politics of persons. Individual autonomy and socio-historical selves. Cambridge: Cambridge University Press.

  • Deutscher Ethikrat. (2019). Impfen als Pflicht? Stellungnahme. ISBN 978-3-941957-83-1 (PDF). https://www.ethikrat.org/fileadmin/Publikationen/Stellungnahmen/deutsch/stellungnahme-impfen-als-pflicht.pdf. Accessed 15 Dec 2019.

  • Frankfurt, H. (2009). Freedom of the will and the concept of a person. In The importance of what we care about (15th printing, pp. 11–25). New York: Cambridge University Press.

  • Gehlen, A. (2016). Der Mensch. Seine Natur und seine Stellung in der Welt. Frankfurt a. M.: Klostermann.

  • Halttu, K., & Oinas-Kukkonen, H. (2017). Persuading to reflect: Role of reflection and insight in persuasive systems design for physical health. Human-Computer Interaction, 32, 381–412.

  • Hollis, V., Konrad, A., Springer, A., Antoun, M., Antoun, C., Martin, R., et al. (2017). What does all this data mean for my future mood? Actionable analytics and targeted reflection for emotional well-being. Human-Computer Interaction, 32, 208–267.

  • Kahneman, D. (2011). Thinking, fast and slow. New York: Macmillan Publishers.

  • Kersten-van Dijk, E., Westerink, J. H. D. M., Beute, F., & Ijsselsteijn, W. A. (2017). Personal informatics, self-insight, and behavior change: A critical review of current literature. Human-Computer Interaction, 32, 268–296.

  • King, P., & Tester, J. (1999). The landscape of persuasive technologies. Communications of the ACM, 42(5), 31–38.

  • Kühler, M., & Mitrović, V. (Eds.). (forthcoming). Theories of the self and autonomy in medical ethics. Cham: Springer.

  • Mackenzie, C., & Stoljar, N. (Eds.). (2000). Relational autonomy. Feminist perspectives on autonomy, agency, and the social self. New York: Oxford University Press.

  • Morozov, E. (2013). To save everything, click here. Technology, solutionism and the urge to fix problems that don’t exist. London: Penguin.

  • Peterson, M., & Spahn, A. (2011). Can technological artefacts be moral agents? Science and Engineering Ethics, 17, 411–424.

  • Quante, M. (2018). Pragmatistic anthropology. Paderborn: mentis.

  • Rapp, A., & Tirassa, M. (2017). Know thyself: A theory of the self for personal informatics. Human-Computer Interaction, 32, 335–380.

  • Spahn, A. (2012). And lead us (not) into persuasion…? Persuasive technology and the ethics of communication. Science and Engineering Ethics, 18, 633–650.

  • Spahn, A. (2013). Moralizing mobility? Persuasive technologies and the ethics of mobility. Transfers, 3(2), 108–115.

  • Taylor, J. S. (Ed.). (2008). Personal autonomy. New essays on personal autonomy and its role in contemporary moral philosophy (First paperback edition). New York: Cambridge University Press.

  • Thaler, R. H., & Sunstein, C. R. (2009). Nudge. Improving decisions about health, wealth, and happiness (Revised and expanded edition). New York: Penguin.

  • Verbeek, P.-P. (2009). Ambient intelligence and persuasive technology: The blurring boundaries between human and technology. Nanoethics, 3, 231–242.

  • Verbeek, P.-P. (2011). Moralizing technology. Understanding and designing the morality of things. Chicago: University of Chicago Press.

Author information

Correspondence to Birgit Beck.


Copyright information

© 2020 Springer-Verlag GmbH Germany, part of Springer Nature

About this chapter

Cite this chapter

Beck, B. (2020). Infantilisation through Technology. In: Beck, B., Kühler, M. (eds) Technology, Anthropology, and Dimensions of Responsibility. Techno:Phil – Aktuelle Herausforderungen der Technikphilosophie, vol 1. J.B. Metzler, Stuttgart. https://doi.org/10.1007/978-3-476-04896-7_4

  • DOI: https://doi.org/10.1007/978-3-476-04896-7_4

  • Publisher Name: J.B. Metzler, Stuttgart

  • Print ISBN: 978-3-476-04895-0

  • Online ISBN: 978-3-476-04896-7

  • eBook Packages: J.B. Metzler Humanities (German Language)
