A Human-Centric Perspective on Digital Consenting: The Case of GAFAM

  • Soheil Human
  • Florian Cech
Conference paper
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 189)


According to various legal frameworks, such as the European General Data Protection Regulation (GDPR), an end-user's consent constitutes one of the well-known legal bases for personal data processing. However, research has shown that the majority of end-users have difficulty making sense of what they are consenting to in the digital world, and that marginalized people face even greater difficulties in managing their digital privacy. In this paper, adopting an enactivist perspective from cognitive science, we develop a basic human-centric framework for digital consent. We argue that the act of consenting is a sociocognitive action comprising cognitive, collective, and contextual aspects. Based on this theoretical framework, we present a qualitative evaluation of the consent-gaining practices of the five big tech companies: Google, Amazon, Facebook, Apple, and Microsoft (GAFAM). The evaluation shows that these companies fall short in their efforts to empower end-users with respect to the human-centric aspects of the act of consenting. On this basis, we argue that their consent-gaining mechanisms violate principles of fairness, accountability, and transparency, and suggest that our approach may even raise doubts about the lawfulness of the acquired consent, particularly in light of the basic requirements of lawful consent under the legal framework of the GDPR.



This work was partially funded through the EXPEDiTE project (Grant 867559) by the Austrian Federal Ministry for Climate Action, Environment, Energy, Mobility, Innovation and Technology under the program "ICT of the Future" between September 2018 and February 2020. We would like to express our great appreciation for the valuable criticism and ideas contributed by Gustaf Neumann, Seyedeh Anahit Kazzazi, Seyedeh Mandan Kazzazi, Stefano Rossetti, Kemal Ozan Aybar, Rita Gsenger, and Niklas Kirchner.



Copyright information

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2021

Authors and Affiliations

  1. Sustainable Computing Lab & Institute for Information Systems and New Media, Vienna University of Economics and Business (WU Wien), Vienna, Austria
  2. Centre for Informatics and Society, Vienna University of Technology (TU Wien), Vienna, Austria
