Abstract
Dark patterns have received significant attention in the literature as interface design practices that undermine users’ autonomy by coercing, misleading, or manipulating their decision making and behavior. Individual autonomy has been proposed as a normative lens for the evaluation of dark patterns. However, theoretical perspectives on autonomy have not been sufficiently adapted in the literature to identify the ethical concerns raised by dark patterns. The aim of this paper is to conceptualize user autonomy within the context of dark patterns. We systematically review 151 dark patterns from 16 taxonomies to understand how dark patterns threaten users’ autonomy. This analysis demonstrates that implications for autonomy arise along four dimensions, because autonomy itself subsumes several distinguishable concepts: agency, freedom of choice, control, and independence. We argue that an assessment of whether a design pattern qualifies as ‘dark’ should account for the sense in which autonomy is threatened, as individuals’ rights and expectations of autonomy vary across contexts and depend on the interpretation of autonomy. This paper contributes to the development of individual autonomy as a normative lens for the evaluation of dark patterns, and for persuasive design more broadly.
Notes
“dark pattern(s),” “anti-pattern(s),” “deceptive design pattern(s),” “FoMo design(s),” and “manipulative design pattern(s).”
Funding
This research was supported by the Prime Minister Doctoral Research Fellowship granted by the Ministry of Human Resource Development, Government of India.
Contributions
This research was conducted as part of the doctoral dissertation of the first author and was supervised by the second author.
Ethics declarations
Conflict of interest
The authors have no relevant financial or non-financial interests to disclose.
Ethical approval
Not applicable.
Informed consent
Not applicable.
About this article
Cite this article
Ahuja, S., Kumar, J. Conceptualizations of user autonomy within the normative evaluation of dark patterns. Ethics Inf Technol 24, 52 (2022). https://doi.org/10.1007/s10676-022-09672-9