
Privacy and Data Protection in the Era of Recommendation Systems: A Postphenomenological Approach

  • Conference paper
  • First Online:
Privacy and Identity Management (Privacy and Identity 2022)

Abstract

Privacy and data protection are two fundamental rights. As complex concepts, they lend themselves to various interpretations aimed at protecting individuals. In this paper, I explore the concepts of ‘privacy’ and ‘data protection’ as directly related to the protection of ‘identity’. I argue that the ability of privacy and data protection law to protect identity is being challenged by recommendation systems. In particular, I explore how recommendation systems continuously influence people based on what can be predicted about them, while the legal tools we have do not fully protect individuals in this regard. This paper aims to bridge this gap by focusing on the work of Porcedda, who examines four different notions of privacy related to identity under Article 7 of the European Charter of Fundamental Rights. Given the huge capacity for analytics that draws on a lawful combination of consent and non-personal data, this paper examines why data protection regulation does not, in fact, fully protect individuals. I explore how the notion of privacy, understood as the protection of identity, is especially relevant to understanding the limitations of data protection law, and I turn to postphenomenology to better contextualize the relationship between identity and recommendation systems.


Notes

  1. This gave ‘privacy’ the status of a human right and inspired the creation in 1950 of Article 8 of the European Convention on Human Rights (ECHR). Article 17 of the International Covenant on Civil and Political Rights (ICCPR), in force since 1976, followed; as an international treaty joined by some 180 states, it is the first legally binding instrument related to privacy applicable in Europe, although it was not legally binding for the European Union.

  2. Specifically, the right to privacy in EU law was not legally binding until the Lisbon Treaty entered into force in that same year, 2009.

  3. Broadly speaking, national laws started to emerge that regulated data rather than the concept of privacy. In 1973 Sweden enacted the first such national law, the ‘Data Act’, followed by the German Federal Data Protection Act of 1978.

  4. Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L 281, 23.11.1995.

  5. New EU regulations such as the Data Act and the AI Act will deal with non-personal data, but these were not yet in force and form part of the EU’s digital transformation agenda for 2030 [25]: https://www.europarl.europa.eu/factsheets/en/sheet/64/digital-agenda-for-europe.

    The Data Governance Act was approved in 2022 but only becomes applicable in 2023, and it applies only to facilitating the re-use of public-sector data. See: https://digital-strategy.ec.europa.eu/en/policies/data-governance-act.

    In addition, especially relevant for this paper is the Digital Services Package, which contains two regulations, the Digital Services Act (DSA) and the Digital Markets Act (DMA), that will become applicable in 2024: https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package.

  6. Article 4(1) GDPR.

  7. Article 6 GDPR.

  8. Herein lies another path, which merits further exploration in the future: could the fundamental right to ‘freedom of thought, conscience and religion’ (Article 9, EU Charter) also be applicable in protecting identity in the digital sphere? In this paper, however, I center the discussion on privacy and data protection as sufficient to understand this issue.

  9. Porcedda, M.G.: Cybersecurity and privacy rights in EU law: moving beyond the trade-off model to appraise the role of technology, p. 74.

  10. The right to the protection of personal data as related to private life has also been at the core of the European culture of data protection since the German Constitutional Court affirmed that we have a “right to informational self-determination”.

  11. Porcedda, op. cit., p. 80.

  12. Although Porcedda’s work mentions that the notion of ‘personal data’ does not account for new inventions of computerized systems and unprecedented (personal) data-processing capabilities, such as transborder data flows (p. 93).

  13. Porcedda, op. cit., p. 83.

  14. Porcedda, op. cit., p. 87.

  15. Taylor (1989), in Sources of the Self, argues that the fundamental interaction of identity is that of love and family: “the increasing possibility to choose freely one’s partners, which places love at the heart of the family, makes family life instrumental to the development of identity” (Idem).

  16. Porcedda, op. cit., p. 88.

  17. Porcedda, op. cit., p. 90.

  18. Idem, p. 91.

  19. Porcedda, op. cit., p. 97.

  20. The field of Science and Technology Studies (STS) has traditionally considered technology as socially constructed, e.g., Trevor Pinch; Langdon Winner.

  21. Further investigations should account for new regulatory tools such as the AI Act and the influence or manipulation of recommendation systems.

  22. Porcedda, op. cit., p. 100.

References

  1. Sunstein, C.R.: Echo chambers: Bush v. Gore, impeachment, and beyond. Princeton University Press, Princeton (2001)


  2. Pariser, E.: The Filter Bubble: What The Internet Is Hiding From You. Penguin UK (2011)


  3. Cadwalladr, C., Graham-Harrison, E.: Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach (2018). https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election

  4. Wylie, C.: Mindf*ck: Inside Cambridge Analytica’s Plot to Break the World. Profile Books (2019)


  5. Horwitz, J., Seetharaman, D.: Facebook Executives Shut Down Efforts to Make the Site Less Divisive (2020). https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499

  6. Posetti, J.: News industry transformation: digital technology, social platforms and the spread of misinformation and disinformation (2018)


  7. The online world still can’t quit the ‘Big Lie’. https://www.politico.com/news/2022/01/06/social-media-donald-trump-jan-6-526562. Accessed 11 Oct 2022

  8. Bellanova, R., González Fuster, G.: No (Big) Data, no fiction? Thinking surveillance with/against Netflix. Presented on February 9 (2018)


  9. Hallinan, B., Striphas, T.: Recommended for you: the Netflix Prize and the production of algorithmic culture. New Media Soc. 18, 117–137 (2016). https://doi.org/10.1177/1461444814538646


  10. Meyer, M.N.: Everything You Need to Know About Facebook’s Controversial Emotion Experiment. https://www.wired.com/2014/06/everything-you-need-to-know-about-facebooks-manipulative-experiment/

  11. As algorithms take over, YouTube’s recommendations highlight a human problem. https://www.nbcnews.com/tech/social-media/algorithms-take-over-youtube-s-recommendations-highlight-human-problem-n867596. Accessed 17 Mar 2022

  12. Thompson, C.: YouTube’s Plot to Silence Conspiracy Theories. https://www.wired.com/story/youtube-algorithm-silence-conspiracy-theories/

  13. Zuboff, S.: The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books (2019). https://books.google.be/books?hl=es&lr=&id=W7ZEDgAAQBAJ&oi=fnd&pg=PT12&dq=surveillance+capitalism+book&ots=dpn6GTSDu0&sig=nlwmFMGAZqt2bwVPdNloOhCRSXQ&redir_esc=y. Accessed 10 Dec 2021

  14. European Data Protection Board: Guidelines 3/2022 on dark patterns in social media platform interfaces. https://edpb.europa.eu/system/files/2022-03/edpb_03-2022_guidelines_on_dark_patterns_in_social_media_platform_interfaces_en.pdf

  15. Biddle, S.: Facebook Uses Artificial Intelligence to Predict Your Future Actions for Advertisers, Says Confidential Document. https://theintercept.com/2018/04/13/facebook-advertising-data-artificial-intelligence-ai/. Accessed 01 Mar 2022

  16. How to clear your viewing history in Netflix. https://www.imore.com/how-clear-your-viewing-history-netflix. Accessed 19 Aug 2022

  17. Facebook’s ad algorithms are still excluding women from seeing jobs. https://www.technologyreview.com/2021/04/09/1022217/facebook-ad-algorithm-sex-discrimination/. Accessed 10 Aug 2022

  18. Nkem, F.U., Chima, O.A., Martins, O.P., Ifeanyi, A.L., Fiona, O.N.: Portrayal of Women in Advertising on Facebook and Instagram (2020). https://doi.org/10.5281/ZENODO.4006048

  19. González Fuster, G., Hijmans, H.: Working paper, Brussels Privacy Hub (2019). https://brusselsprivacyhub.eu/events/20190513.Working_Paper_Gonza%CC%81lez_Fuster_Hijmans.pdf

  20. EU Charter of Fundamental Rights. https://ec.europa.eu/info/aid-development-cooperation-fundamental-rights/your-rights-eu/eu-charter-fundamental-rights_en. Accessed 05 Aug 2022

  21. Cohen, J.E.: What privacy is for symposium: privacy and technology. Harv. Law Rev. 126, 1904–1933 (2012)


  22. Gutwirth, S., De Hert, P.: Privacy, data protection and law enforcement. Opacity of the individual and transparency of power. Direito Público. 18 (2022). https://doi.org/10.11117/rdp.v18i100.6200

  23. Warren, S.D., Brandeis, L.D.: The right to privacy. Harv. Law Rev. 4, 193–220 (1890). https://doi.org/10.2307/1321160


  24. Westin, A.F.: Special report: legal safeguards to insure privacy in a computer society. Commun. ACM. 10, 533–537 (1967). https://doi.org/10.1145/363566.363579


  25. Regulation (EU) 2016/679 (General Data Protection Regulation), EUR-Lex. https://eur-lex.europa.eu/eli/reg/2016/679/oj. Accessed 01 Mar 2022

  26. Digital Agenda for Europe | Fact Sheets on the European Union | European Parliament. https://www.europarl.europa.eu/factsheets/en/sheet/64/digital-agenda-for-europe. Accessed 13 Dec 2022

  27. Meta Privacy Policy – How Meta collects and uses user data. https://www.facebook.com/privacy/policy/?entry_point=data_policy_redirect&entry=0. Accessed 05 Aug 2022

  28. Solove, D.J.: A taxonomy of privacy. Univ. Pa. Law Rev. 154, 477–564 (2005)


  29. Bellanova, R.: Digital, politics, and algorithms: governing digital data through the lens of data protection. Eur. J. Soc. Theory 20, 329–347 (2017). https://doi.org/10.1177/1368431016679167


  30. Taylor, L., Floridi, L.: Group Privacy: New Challenges of Data Technologies. Springer (2017)


  31. O’Neil, C.: Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown (2016)


  32. Porcedda, M.G.: Cybersecurity and privacy rights in EU law : moving beyond the trade-off model to appraise the role of technology (2017). http://hdl.handle.net/1814/26594. https://doi.org/10.2870/4605

  33. Schoeman, F.D.: Philosophical Dimensions of Privacy: An Anthology. Cambridge University Press, Cambridge (1984)


  34. Naughton, J.: Facebook Saves The Stuff You Type — Even If You Have Second Thoughts And Delete It BEFORE You Post. https://www.businessinsider.com/facebook-saves-stuff-you-start-typing-and-the-delete-2013-12. Accessed 14 Dec 2022

  35. Hill, K.: How Target Figured Out A Teen Girl Was Pregnant Before Her Father Did. https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/. Accessed 12 Aug 2022

  36. What’s in a name?: identiteitsfraude en -diefstal [identity fraud and theft]. Maklu (2012)


  37. How to Filter, Block, and Report Harmful Content on Social Media | RAINN. https://www.rainn.org/articles/how-filter-block-and-report-harmful-content-social-media. Accessed 12 Aug 2022

  38. Cheney-Lippold, J.: A new algorithmic identity: soft biopolitics and the modulation of control. Theory Cult. Soc. 28, 164–181 (2011). https://doi.org/10.1177/0263276411424420


  39. The Digital Subject: People as Data as Persons. https://doi.org/10.1177/0263276419840409. Accessed 22 Oct 2022

  40. Rodotà, S.: Data protection as a fundamental right. In: Gutwirth, S., Poullet, Y., De Hert, P., de Terwangne, C., Nouwt, S. (eds.) Reinventing Data Protection?, pp. 77–82. Springer, Dordrecht (2009). https://doi.org/10.1007/978-1-4020-9498-9_3


  41. Agre, P.E.: The architecture of identity: embedding privacy in market institutions. Inf. Commun. Soc. 2, 1–25 (1999). https://doi.org/10.1080/136911899359736


  42. Winner, L.: Do artifacts have politics? Daedalus 109, 121–136 (1980)


  43. Ihde, D.: Postphenomenology: Essays in the Postmodern Context. Northwestern University Press (1995)


  44. Verbeek, P.-P.: What Things Do: Philosophical Reflections on Technology, Agency, and Design. Penn State University Press (2021). https://doi.org/10.1515/9780271033228


Acknowledgements

This paper has received valuable comments and support from my supervisors Rosamunde Van Brakel, Trisha Meyer and Rocco Bellanova, and has benefited from the previous work and inspiration of Maria Grazia Porcedda and Paul de Hert.

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 813497.

Author information


Corresponding author

Correspondence to Ana Fernández Inguanzo.



Copyright information

© 2023 IFIP International Federation for Information Processing

About this paper


Cite this paper

Fernández Inguanzo, A. (2023). Privacy and Data Protection in the Era of Recommendation Systems: A Postphenomenological Approach. In: Bieker, F., Meyer, J., Pape, S., Schiering, I., Weich, A. (eds) Privacy and Identity Management. Privacy and Identity 2022. IFIP Advances in Information and Communication Technology, vol 671. Springer, Cham. https://doi.org/10.1007/978-3-031-31971-6_11


  • DOI: https://doi.org/10.1007/978-3-031-31971-6_11


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-31970-9

  • Online ISBN: 978-3-031-31971-6

  • eBook Packages: Computer Science, Computer Science (R0)
