
Optimization of what? For-profit health apps as manipulative digital environments

Abstract

Mobile health applications (‘health apps’) that promise to help users with some aspect of their health are very popular: for-profit apps such as MyFitnessPal, Fitbit, or Headspace have tens of millions of users each. For-profit health apps are designed and run as optimization systems. One would expect these health apps to aim to optimize the health of the user, but in reality they aim to optimize user engagement and, in effect, conversion. This is problematic, I argue, because digital health environments that aim to optimize user engagement risk being manipulative. To develop this argument, I first provide a brief analysis of the underlying business models and the resulting designs of the digital environments provided by popular for-profit health apps. In a second step, I present a concept of manipulation that can help analyze digital environments such as health apps. In the last part of the article, I use this concept of manipulation to analyze the manipulative potential of for-profit health apps. Although for-profit health apps can certainly empower their users, the conditions for empowerment also largely overlap with the conditions for manipulation. As a result, we should be cautious when embracing the empowerment discourse surrounding health apps. An additional aim of this article is to contribute to the rapidly growing literature on digital choice architectures and the ethics of influencing behavior through such choice architectures. I take health apps to be a paradigmatic example of digital choice architectures that give rise to ethical questions, so my analysis of the manipulative potential of health apps can also inform this larger literature.


Notes

  1. I thus do not focus on non-profit health apps, or on the more medical health apps that are built—often in cooperation with academic hospitals—to address a very specific medical problem. I focus on the big commercial players that develop ‘healthy lifestyle and wellness’ services for the general consumer population.

  2. Conversion refers to the process of turning users into profitable users.

  3. Because of the limited space at my disposal, I will not explicitly engage with related challenges posed by health apps, such as challenges of informational privacy (Floridi 2005; Patterson 2013; Lanzing 2016) and surveillance (Lupton 2012).

  4. Thanks to Thomas Nys for pointing this out to me. See also Nys (2016).

  5. See Fahy et al. (2018) for a more elaborate analysis of Apple’s and Google’s developer platforms and the different kinds of monetization strategies apps can use.

  6. Apple, ‘Choosing a Business Model’: https://developer.apple.com/app-store/business-models/.

  7. Apple, ‘Choosing a Business Model’: https://developer.apple.com/app-store/business-models/.

    Google, ‘Earn more revenue with the right monetization options’: https://developer.android.com/distribute/best-practices/earn/monetization-options.

  8. Google, ‘Earn more revenue with the right monetization options’: https://developer.android.com/distribute/best-practices/earn/monetization-options.

  9. In the case of health apps this means advertising that is deliberately designed to ‘look and feel’ like health content, thereby obfuscating to the user the true commercial nature of the content.

  10. This option is not explicitly mentioned by Apple or Google. It is, however, a serious option. MyFitnessPal, an immensely popular calorie counting app with millions of users, is a good example. On its jobs page, MyFitnessPal announces that “MyFitnessPal has the largest database of human eating habits in the world. The opportunities for a data scientist here are almost endless.” (https://www.myfitnesspal.com/jobs) Forbes reports that health care providers and researchers can access the database when they enter into a “formal partnership” with MyFitnessPal (Olson 2014).

  11. Apple, ‘Using the Freemium Model’: https://developer.apple.com/app-store/freemium-business-model/.

  12. Google, ‘Improve Conversion Using Google Analytics for Firebase’: https://developer.android.com/distribute/best-practices/earn/improve-conversions.

  13. Google, ‘Improve Conversion Using Google Analytics for Firebase’: https://developer.android.com/distribute/best-practices/earn/improve-conversions.

  14. Google, ‘Improve Conversion Using Google Analytics for Firebase’: https://developer.android.com/distribute/best-practices/earn/improve-conversions.

  15. Headspace, ‘About Headspace’: https://www.headspace.com/about-us.

  16. This lack of access to the (business) operations of Big Tech companies could of course be criticized for a variety of reasons. For example, it makes it harder for investigative journalists and academics to scrutinize the practices of these companies. The same holds for policymakers and regulators who often have a hard time gaining access to Big Tech companies.

  17. Headspace, ‘Senior Data Analyst,’ job description that has since been removed, screenshot available here: https://imgur.com/a/qtSe4Ii.

  18. Headspace, ‘VP of Growth,’ job description that has since been removed, screenshot available here: https://imgur.com/a/qtSe4Ii.

  19. Headspace, ‘VP of Growth,’ job description that has since been removed, screenshot available here: https://imgur.com/a/qtSe4Ii.

  20. There is a rich literature on Foucauldian biopower and health and the role (digital) technologies can play in the exercise of biopower (e.g., Foucault 1975; Armstrong 1995; Petersen and Bunton 1997; Casper and Morrison 2010; Lupton 2012; Mayes 2015; Ajana 2017; Fotopoulou and O’Riordan 2017; Sanders 2017). Although this literature provides interesting and promising perspectives for my research, I do not have enough space at my disposal in this article to incorporate this complex literature into my argument.

  21. NativeAdBuzz, ‘This Health and Wellness Boom Has Been Building for Years… And It’s Finally About to ERUPT (Urgent: Your Free VIP Christmas Gift Has Arrived)’: http://www.nativeadbuzz.com/blog/this-health-and-wellness-boom-has-been-building-for-years-and-its-finally-about-to-erupt-urgent-your-free-vip-christmas-gift-has-arrived/.

  22. Fitbit, ‘Why Fitbit’: https://www.fitbit.com/whyfitbit.

  23. MyFitnessPal, ‘These Playlists Were Built to Make You Better’: https://blog.myfitnesspal.com/these-playlists-were-built-to-make-you-better/.

  24. MyFitnessPal, ‘These On-Ear Headphones Can Actually Withstand Your Workouts’: https://blog.myfitnesspal.com/these-on-ear-headphones-can-actually-withstand-your-workouts/.

  25. MyFitnessPal, ‘Why and How You Should Nix an Alarm Clock’: https://blog.myfitnesspal.com/why-and-how-you-should-nix-an-alarm-clock/.

  26. MyFitnessPal, ‘A Day in the Life of a Yoga Teacher’: https://blog.myfitnesspal.com/day-life-yoga-teacher/.

  27. MyFitnessPal, ‘How a Nutritionist Spends $50 at Whole Foods’: https://blog.myfitnesspal.com/how-a-nutritionist-spends-50-at-whole-foods/.

  28. Contrary to authors like Buss (2005) and Wood (2014), who treat manipulation as a non-moralized term.

  29. Susser et al. (2019a) speak of ‘vulnerabilities.’ I rather use the looser term ‘exploitable characteristics of a person’ because the term ‘vulnerabilities’ is sometimes associated (especially in legal discourse) with a fixed set of narrowly defined weaknesses, such as those that result from one’s age (‘the old and the young’) or from one’s physical or mental infirmities (‘people with medically diagnosed handicaps’).

  30. Rudinow (1978, p. 346, emphasis added) explains that “the manipulator’s behavior is normally either deceptive or predicated on some privileged insight into the personality of his intended manipulee.”

  31. Mills (2014, p. 138) provides a similar argument, referring to Gorin (2014) and Barnhill (2014): “Both Gorin and Barnhill point out that manipulation does not need to involve deception or covertness; these are not defining features of manipulation necessarily present in all cases of what we could agree to be manipulation. But most manipulators seek to hide the degree to which they are angling to achieve their desired result and would find the success of their project seriously compromised if their manipulative intentions were revealed.”

  32. Susser et al. (2019a, b) provide a more elaborate discussion on the connection between autonomy and manipulation.

  33. Christman (1991, p. 10) has formulated a popular, somewhat weaker alternative: “What matters is what the agent thinks about the process of coming to have the desire, and whether she resists that process when (or if) given the chance.”

  34. I admit that the resulting picture can feel a bit messy or unclear. Someone can have a manipulative mindset but, in the end, be drawn to techniques that are not manipulative in nature; an in-principle manipulative mindset does not necessarily lead to manipulation. Although I agree that the resulting picture is messy, I see no way to avoid this. The empirical reality of data-driven, dynamically adjustable choice architectures simply is very messy. The industry constantly runs (multiple, parallel) experiments on a plethora of tweaks to its digital choice architectures to test whether those tweaks can successfully shape (patterns of) behavior. In this constant hunt for new behavioral influence techniques, some will turn out to be manipulative, while others will turn out to be something else (e.g., coercive).

  35. It could of course still be argued that quite a few business-to-consumer practices rise to the level of manipulation. Take, for example, Santilli (1983) and Crisp (1987), who argue that nearly all advertising is, at least slightly, manipulative. If one really wants to stretch my concept of manipulation, one could even try to argue that a billboard showing advertising is manipulative. Such a billboard with an advertisement for company X is (1) put up intentionally by company X, (2) with the aim of furthering the ends of company X without a genuine regard for the interests of the people passing by the billboard, (3) designed by company X in such a manner that it targets either particular people in the street or particular desires or fears of people in the street, and (4) does not explicitly communicate that “company X is trying to target you in such a manner that company X earns as much as possible.” Even if we agree that a billboard can, strictly speaking, be interpreted as manipulative, it does not follow that every instance of manipulation warrants the same level of scrutiny. There is a significant difference between, on the one hand, a billboard that displays one and the same message to every person at a fixed location and, on the other hand, a digital health environment that builds a relationship with the user over time, offers a continuous communication channel to the user, and can be personalized in real time based on what continuous experiments indicate will lead to maximum engagement. Unlike billboards, digital technologies like the health apps discussed here can get to know their users over time and can, at any time they see fit (e.g., through push notifications), try to leverage that information to manipulate every user personally.

  36. Headspace, ‘Senior Data Analyst,’ job description that has since been removed, screenshot available here: https://imgur.com/a/qtSe4Ii.

  37. NativeAdBuzz, ‘This Health and Wellness Boom Has Been Building for Years… And It’s Finally About to ERUPT (Urgent: Your Free VIP Christmas Gift Has Arrived)’: http://www.nativeadbuzz.com/blog/this-health-and-wellness-boom-has-been-building-for-years-and-its-finally-about-to-erupt-urgent-your-free-vip-christmas-gift-has-arrived/.

  38. Already in 1999, Hanson and Kysar used the concept of ‘market manipulation’ to identify such cases, a concept that was later updated by Calo (2014) who spoke of ‘digital market manipulation.’ Calo (2014, p. 1018) noted how “firms will increasingly be able to create suckers, rather than waiting for one to be born.” Spencer (2019, p. 34) has argued in a similar vein that “[r]ather than discovering existing vulnerabilities, marketers could exacerbate or even create vulnerabilities in individual subjects and then exploit those vulnerabilities.”

  39. For example, Headspace was funded through four funding rounds, raising $75 million (https://www.crunchbase.com/organization/headspace#section-investors). MyFitnessPal also received funding from venture capitalists (https://www.crunchbase.com/organization/myfitnesspal#section-investors) and was later acquired by Under Armour for $475 million (Olson 2015). Fitbit also saw four funding rounds raising $66 million from venture capitalists (https://www.crunchbase.com/organization/fitbit#section-investors).

  40. Consider also Culbert et al.’s (2015) meta-analysis of what causes people’s problematic relation to food. They emphasize that especially for perfectly healthy adolescent and young adult females, (digital) media exposure, and more specifically health ideals portrayed in those media, “have all been shown to prospectively predict increased levels of disordered eating cognitions and behaviors (e.g., body dissatisfaction, dieting, bulimic symptoms)” (Culbert et al. 2015, p. 1145).

  41. Here are a few examples. The blog post called ‘A Day in the Life of a Yoga Teacher’ (https://blog.myfitnesspal.com/day-life-yoga-teacher/) is, in reality, native advertising for skincare products that are framed as being part of a healthy, mindful life. The blog post called ‘These Playlists Were Built to Make You Better’ (https://blog.myfitnesspal.com/these-playlists-were-built-to-make-you-better/) is native advertising for a particular brand of headphones. The blog post called ‘Why and How You Should Nix an Alarm Clock’ (https://blog.myfitnesspal.com/why-and-how-you-should-nix-an-alarm-clock/) is native advertising for a company offering “certified sleep coaches” and for a company selling a wide range of sleep products.

References

  • Ajana, B. (2017). Digital health and the biopolitics of the quantified self. Digital Health, 3, 1–18.

  • Alter, A. (2017). Irresistible: The rise of addictive technology and the business of keeping us hooked. New York: Penguin Press.

  • Anderson, J. H. (2014). Autonomy and vulnerability entwined. In C. Mackenzie, W. Rogers, & S. Dodds (Eds.), Vulnerability: New essays in ethics and feminist philosophy (pp. 134–161). Oxford: Oxford University Press.

  • Anderson, J. H., & Honneth, A. (2005). Autonomy, vulnerability, recognition, and justice. In J. Christman & A. Honneth (Eds.), Autonomy and the challenges to liberalism (pp. 127–149). Cambridge: Cambridge University Press.

  • Armstrong, D. (1995). The rise of surveillance medicine. Sociology of Health & Illness, 17(3), 393–404.

  • Barnhill, A. (2014). What is manipulation? In C. Coons & M. Weber (Eds.), Manipulation: Theory and practice (pp. 51–72). Oxford: Oxford University Press.

  • Baron, M. (2003). Manipulativeness. Proceedings and Addresses of the American Philosophical Association, 77(2), 37–54.

  • Boorse, C. (1975). On the distinction between disease and illness. Philosophy and Public Affairs, 5(1), 49–68.

  • Boorse, C. (1977). Health as a theoretical concept. Philosophy of Science, 44(4), 542–573.

  • Bovens, L. (2009). The ethics of nudge. In T. Grüne-Yanoff & S. Hansson (Eds.), Preference change: Approaches from philosophy, economics and psychology. New York: Springer.

  • Brodesser-Akner, T. (2018). How Goop’s haters made Gwyneth Paltrow’s company worth $250 million: Inside the growth of the most controversial brand in the wellness industry. The New York Times Magazine, July 25, 2018. Retrieved from https://www.nytimes.com/2018/07/25/magazine/big-business-gwyneth-paltrow-wellness.html.

  • Buss, S. (2005). Valuing autonomy and respecting persons: Manipulation, seduction, and the basis of moral constraints. Ethics, 115(2), 195–235.

  • Calo, R. (2014). Digital market manipulation. George Washington Law Review, 82(4), 995–1051.

  • Casper, M., & Morrison, D. (2010). Medical sociology and technology: Critical engagements. Journal of Health and Social Behavior, 51(1), 12–32.

  • Cederström, C., & Spicer, A. (2015). The wellness syndrome. Cambridge: Polity Press.

  • Chaykowski, K. (2017). Meet Headspace, the app that made meditation a $250 million business. Forbes, January 8, 2017. Retrieved from https://www.forbes.com/sites/kathleenchaykowski/2017/01/08/meet-headspace-the-app-that-made-meditation-a-250-million-business/#7641f4f81f1b.

  • Christman, J. (1991). Autonomy and personal history. Canadian Journal of Philosophy, 21(1), 1–24.

  • Christman, J. (2004). Relational autonomy, liberal individualism, and the social constitution of selves. Philosophical Studies, 117(1/2), 143–164.

  • Code, L. (1991). What can she know? Feminist theory and the construction of knowledge. Ithaca: Cornell University Press.

  • Cohen, S. (2018). Manipulation and deception. Australasian Journal of Philosophy, 96(3), 483–497.

  • Crawford, R. (2006). Health as a meaningful social practice. Health, 10(4), 401–420.

  • Crisp, R. (1987). Persuasive advertising, autonomy, and the creation of desire. Journal of Business Ethics, 6(5), 413–418.

  • Culbert, K. M., Racine, S. E., & Klump, K. L. (2015). Research review: What we have learned about the causes of eating disorders – a synthesis of sociocultural, psychological, and biological research. Journal of Child Psychology and Psychiatry, 56(11), 1141–1164.

  • Eikey, E. V., & Reddy, M. C. (2017). “It’s definitely been a journey”: A qualitative study on how women with eating disorders use weight loss apps. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 642–654).

  • Eikey, E. V., Reddy, M. C., Booth, K. M., Kvasny, L., Blair, J. L., Li, V., & Poole, E. (2017). Desire to be underweight: Exploratory study on weight loss app community and user perceptions of the impact on disordered eating behaviors. JMIR mHealth and uHealth, 5(10), e150.

  • European Commission. (2012). Communication from the Commission to the Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, eHealth Action Plan 2012–2020: Innovative healthcare for the 21st century (Brussels, 6.12.2012, COM/2012/0736 final). Brussels: European Union.

  • European Commission. (2014). Green paper on mobile health (Brussels, 10.4.2014, COM(2014) 135 final). Available at https://ec.europa.eu/digital-single-market/en/news/green-paper-mobile-health-mhealth.

  • Eyal, N. (2014). Hooked: How to build habit-forming products. New York: Portfolio/Penguin.

  • Faden, R. R., & Beauchamp, T. L. (1986). A history and theory of informed consent. New York: Oxford University Press.

  • Fahy, R., Van Hoboken, J., & Van Eijk, N. (2018). Data privacy, transparency and the data-driven transformation of games to services. In Proceedings of the IEEE Games, Entertainment, Media Conference (pp. 1–9).

  • Floridi, L. (2005). The ontological interpretation of informational privacy. Ethics and Information Technology, 7(4), 185–200.

  • Fotopoulou, A., & O’Riordan, K. (2017). Training to self-care: Fitness tracking, biopedagogy and the healthy consumer. Health Sociology Review, 26(1), 54–68.

  • Foucault, M. (1975). The birth of the clinic: An archaeology of medical perception. New York: Vintage Books.

  • Frischmann, B., & Selinger, E. (2018). Re-engineering humanity. Cambridge: Cambridge University Press.

  • Gorin, M. (2014). Towards a theory of interpersonal manipulation. In C. Coons & M. Weber (Eds.), Manipulation: Theory and practice (pp. 73–97). Oxford: Oxford University Press.

  • Greenspan, P. (2003). The problem with manipulation. American Philosophical Quarterly, 40(2), 155–164.

  • Hanson, J. D., & Kysar, D. A. (1999). Taking behavioralism seriously: The problem of market manipulation. New York University Law Review, 74(3), 630–749.

  • Hardin, R. (2002). Trust and trustworthiness. New York: Russell Sage Foundation.

  • Hayek, F. A. (2006 [1960]). Freedom and coercion. In D. L. Miller (Ed.), The liberty reader (pp. 80–99). New York: Routledge.

  • Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.

  • Lanzing, M. (2016). The transparent self. Ethics and Information Technology, 18(1), 9–16.

  • Lanzing, M. (2018). “Strongly recommended”: Revisiting decisional privacy to judge hypernudging in self-tracking technologies. Philosophy & Technology, online first.

  • Lupton, D. (2012). M-Health and health promotion: The digital cyborg and surveillance society. Social Theory & Health, 10(3), 229–244.

  • Lupton, D. (2013). Quantifying the body: Monitoring and measuring health in the age of mHealth technologies. Critical Public Health, 23(4), 393–403.

  • Lupton, D. (2018). I just want it to be done, done, done! Food tracking apps, affects, and agential capacities. Multimodal Technologies and Interaction, 2(2), 29–44.

  • Mackenzie, C., & Stoljar, N. (2000). Introduction: Autonomy refigured. In C. Mackenzie & N. Stoljar (Eds.), Relational autonomy: Feminist perspectives on autonomy, agency, and the social self (pp. 3–31). Oxford: Oxford University Press.

  • Mayes, C. (2015). The biopolitics of lifestyle: Foucault, ethics and healthy choices. London: Routledge.

  • Meyers, D. (1989). Self, society and personal choice. Oxford: Oxford University Press.

  • Mills, C. (2014). Manipulation as an aesthetic flaw. In C. Coons & M. Weber (Eds.), Manipulation: Theory and practice (pp. 135–150). Oxford: Oxford University Press.

  • Nedelsky, J. (1989). Reconceiving autonomy: Sources, thoughts and possibilities. Yale Journal of Law and Feminism, 1, 7–36.

  • Noggle, R. (1996). Manipulative actions: A conceptual and moral analysis. American Philosophical Quarterly, 33(1), 43–55.

  • Nordenfelt, L. (1986). Health and disease: Two philosophical perspectives. Journal of Epidemiology and Community Health, 41, 281–284.

  • Nordenfelt, L. (1987). On the nature of health: An action theoretic approach. Dordrecht: Reidel.

  • Nys, T. R. V. (2016). Autonomy, trust, and respect. Journal of Medicine and Philosophy, 41(1), 10–24.

  • Olson, P. (2014). MyFitnessPal starts tracking steps to grow the world’s largest nutrition database. Forbes, May 1, 2014. Retrieved from https://www.forbes.com/sites/parmyolson/2014/05/01/myfitnesspal-starts-tracking-steps-to-grow-the-worlds-largest-nutrition-database/#341b09d05968.

  • Olson, P. (2015). Under Armour buys health-tracking app MyFitnessPal for $475 million. Forbes, February 4, 2015. Retrieved from https://www.forbes.com/sites/parmyolson/2015/02/04/myfitnesspal-acquisition-under-armour/#1a75350c6935.

  • Overdorf, R., Kulynych, B., Balsa, E., Troncoso, C., & Gürses, S. (2018). POTs: Protective optimization technologies. arXiv:1806.02711.

  • Patterson, H. (2013). Contextual expectations of privacy in self-generated health information flows. In The 41st Research Conference on Communication, Information and Internet Policy. Retrieved from https://ssrn.com/abstract=2242144.

  • Petersen, A., & Bunton, R. (Eds.). (1997). Foucault, health, and medicine. London: Routledge.

  • Rudinow, J. (1978). Manipulation. Ethics, 88(4), 338–347.

  • Sanders, R. (2017). Self-tracking in the digital era: Biopower, patriarchy, and the new biometric body projects. Body & Society, 23(1), 36–63.

  • Santilli, P. C. (1983). The informative and persuasive functions of advertising: A moral appraisal. Journal of Business Ethics, 2(1), 27–33.

  • Sax, M., Helberger, N., & Bol, N. (2018). Health as a means towards profitable ends: mHealth apps, user autonomy, and unfair commercial practices. Journal of Consumer Policy, 41(2), 103–134.

  • Spencer, S. B. (2019). The problem of online manipulation. Retrieved from https://ssrn.com/abstract=3341653.

  • Stoljar, N. (2011). Informed consent and relational conceptions of autonomy. The Journal of Medicine and Philosophy, 36(4), 375–384.

  • Susser, D. (2019). Invisible influence: Artificial intelligence and the ethics of adaptive choice architectures. In AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society, January 27–28, Honolulu, Hawaii, USA. Retrieved from http://www.aies-conference.com/wp-content/papers/main/AIES-19_paper_54.pdf.

  • Susser, D., Roessler, B., & Nissenbaum, H. (2019a). Online manipulation: Hidden influences in a digital world. Georgetown Law Technology Review, 4(1), 1–45.

  • Susser, D., Roessler, B., & Nissenbaum, H. (2019b). Technology, autonomy, and manipulation. Internet Policy Review. https://doi.org/10.14763/2019.2.1410.

  • Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. New Haven, CT: Yale University Press.

  • Wojdynski, B. W., & Evans, N. J. (2016). Going native: Effects of disclosure position and language on the recognition and evaluation of online advertising. Journal of Advertising, 45(2), 157–168.

  • Wojdynski, B. W., Bang, H., Keib, K., Jefferson, B. N., Choi, D., & Malson, J. L. (2017). Building a better native advertising disclosure. Journal of Interactive Advertising, 17(2), 150–161.

  • Wood, A. W. (2014). Coercion, manipulation, exploitation. In C. Coons & M. Weber (Eds.), Manipulation: Theory and practice (pp. 17–50). Oxford: Oxford University Press.

  • World Health Organization. (2011). mHealth. New horizons for health through mobile technologies. Report based on the findings of the second global survey on eHealth. Geneva: World Health Organization.

  • Yeung, K. (2017). ‘Hypernudge’: Big data as a mode of regulation by design. Information, Communication & Society, 20(1), 118–136.

  • Zuboff, S. (2015). Big other: Surveillance capitalism and the prospect of an information civilization. Journal of Information Technology, 30(1), 75–89.

  • Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the frontier of power. New York: Public Affairs.


Acknowledgments

This article was written during a research visit to The Digital Life Initiative at Cornell Tech. I would like to thank Helen Nissenbaum, Eran Toch, Elizabeth O’Neill, Jake Goldenfein, Michael Byrne, Erica Du, Yvonne Wang, Lauren van Haaften-Schick, Mason Marks, and Jessie Taft for providing me with a welcoming and intellectually stimulating environment during my research visit. Special thanks to Elizabeth O’Neill, Eva Groen-Reijman, Thomas Nys, and Beate Roessler for extensive feedback on earlier versions of this article. I would also like to thank the anonymous reviewers for their constructive reviews and helpful suggestions for improvement of the article. The study was supported by the Research Priority Area ‘Personalised Communication’ of the University of Amsterdam.

Author information


Corresponding author

Correspondence to Marijn Sax.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Sax, M. Optimization of what? For-profit health apps as manipulative digital environments. Ethics Inf Technol 23, 345–361 (2021). https://doi.org/10.1007/s10676-020-09576-6


Keywords

  • Manipulation
  • Health apps
  • mHealth
  • Autonomy
  • Choice architectures