Abstract
This contribution examines the legal grounds for data processing (‘when is one allowed to collect and use data on others?’) under the General Data Protection Regulation (GDPR). It then addresses the specific regime for profiling, both by solely automated means and by non-automated means. What is the most suitable lawful basis for this specific, sometimes controversial, kind of processing?
The vagueness and subjectivity of several relevant GDPR provisions in this area can undermine legal certainty. Data protection principles such as transparency and overall fairness, as enshrined in Article 5 GDPR, may then serve as a touchstone for identifying appropriate checks and balances. Additional guidance can be found outside data protection legislation, for instance in competition law.
Notes
Regulation (EU) 2016/679, General Data Protection Regulation [43].
Art. 2 GDPR.
An earlier version of this paper was presented at the ERA Brussels Annual Conference on EU Data Protection Law (DPL) in April 2018.
Directive 95/46 [19].
Hildebrandt [26].
Charter of Fundamental Rights of the European Union [13].
Convention for the Protection of Human Rights and Fundamental Freedoms [14].
European Data Protection Supervisor (EDPS) [20].
For a deeper understanding of this legal ground, see Kosta [33].
Information Commissioner’s Office [30].
Recital 43 GDPR and Art. 29 Working Party [2].
Art. 7(4) GDPR and Art. 29 Working Party [2].
In addition to the consent requirements provided for in the GDPR, the future e-Privacy Regulation (still in the drafting process at the time of writing) is likely to address tracking walls by imposing a ban on them.
Privacy & Information Security Law Blog [42].
Martin [37].
Reidenberg, Russell, Callen, Qasir, and Norton [44].
Kuner et al. [34].
This issue was christened ‘the paradox of transparency’ by Barocas and Nissenbaum [4].
Culnan and Bruening [16].
Recital 32 GDPR.
Culnan and Bruening [16].
The Information Commissioner’s Office privacy notice can be found here: https://ico.org.uk/global/privacy-notice/.
For a further analysis of legitimate interests see Ferretti [21]. Ferretti argues in favour of a restrictive interpretation of legitimate interests as a lawful basis for processing in order to provide what he calls a ‘rights’ perspective (i.e., an approach protective of people’s rights).
Ferretti [21].
For further analysis on the importance of processing personal and other types of data for security purposes see Hoffman and Rimo [27].
Art. 29 Working Party [3].
Data Protection Network [17].
Data Protection Network [17].
Kamara and de Hert [32].
Van der Sloot and Borgesius [46].
Centre for Information Policy Leadership (CIPL) [12]. The report argues in favour of enabling the use of legitimate interests as a basis for the processing of electronic communications personal data (both content and metadata), contrary to the position taken by the proposed ePrivacy Regulation, which allows the processing of such data with the consent of the user but not under the legitimate interests of the service provider. Specifically, the report asserts that, in practice, individual privacy rights may ultimately enjoy stronger protection where an organization relies on legitimate interests for processing personal data, including electronic communications data, because a company relying on Art. 6(1)(f) GDPR must implement additional measures to pass the balancing test. On the other hand, the report argues, although consent gives individuals the highest level of control over their personal data (at least in theory), it may not always result in better protection of their rights in practice.
Recital 47.
Kamara and de Hert [32].
Art. 5 GDPR.
Council of Europe [15].
Lammerant and de Hert [35].
Even where data will later be anonymised, the mere collection of personal data makes the data protection framework applicable. Moreover, given the large or rich datasets common nowadays and the powerful analytical tools applied to them, whether data can truly be anonymised raises serious concerns. The most common approach is therefore to consider that this stage involves coded data (i.e., pseudonymised data) rather than anonymised data. Consequently, data protection rules must be respected throughout this process.
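The distinction between coded (pseudonymised) and anonymised data can be illustrated with a minimal sketch. The example below is hypothetical (the key, field names, and function are the authors’ of this edit, not drawn from the GDPR or the paper): a direct identifier is replaced by a keyed hash, but whoever holds the key can re-identify the record, so the result remains pseudonymised data within the scope of the GDPR rather than anonymised data.

```python
import hashlib
import hmac

# Hypothetical sketch of pseudonymisation (coded data). The secret key
# acts as the "code": holding it allows re-identification, so the output
# is pseudonymised, not anonymised, and data protection rules still apply.
SECRET_KEY = b"keep-this-key-separate-from-the-dataset"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "purchase": "book"}
coded = {
    "subject_id": pseudonymise(record["email"]),  # no direct identifier left
    "purchase": record["purchase"],
}
```

Because the same identifier always maps to the same pseudonym, records can still be linked and singled out, which is precisely why pseudonymised data stays inside the data protection framework.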
In contrast, merely selecting individuals on the basis of their actual characteristics cannot be deemed profiling. For instance, if a financial institution selects customers with an income of over 8000 euros per month and with assets valued at over one million euros and groups these customers together, the bank has not engaged in profiling but in a mere selection, which involves no margin of error.
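The bank example above can be sketched as a deterministic filter. This is a hypothetical illustration (the customer data and field names are invented): nothing is inferred or predicted about the individuals, only explicit criteria are applied, which is why such a selection carries no margin of error, unlike profiling.

```python
# Hypothetical customer records; values invented for illustration.
customers = [
    {"name": "A", "monthly_income_eur": 9500, "assets_eur": 1_200_000},
    {"name": "B", "monthly_income_eur": 7000, "assets_eur": 2_000_000},
    {"name": "C", "monthly_income_eur": 8500, "assets_eur": 900_000},
]

# Deterministic selection on real, known characteristics: no attribute is
# inferred, so there is no margin of error and no profiling in the GDPR sense.
selected = [
    c for c in customers
    if c["monthly_income_eur"] > 8000 and c["assets_eur"] > 1_000_000
]
# Only customer "A" satisfies both criteria.
```

Profiling, by contrast, would assign individuals characteristics they are merely *predicted* to have (e.g., a creditworthiness score), which is where the margin of error arises.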
Vedder [48].
Lammerant and de Hert [35].
Vedder [48].
Information Commissioner’s Office [29].
Recital 71.
National Non-Discrimination and Equality Tribunal of Finland, Case C-216/2017 [39].
Recital 71.
This provision has attracted some criticism. See, as an example, the Centre for Information Policy Leadership (CIPL) [11]. The report highlights that legitimate interests are not among the lawful bases for automated decision-making, including profiling, while valid consent may prove difficult to obtain: ‘How can consent be ‘specific, informed and unambiguous’ if an organization may not be fully aware of how collected data will be used, or of all subsequent purposes of processing at the time of collection?’.
Gil González [23].
It is also arguable whether the controller is obliged to provide information about inferred data under Art. 15(1)(b), on the ‘categories of personal data’ concerned, or under Art. 15(3), whereby ‘the controller shall provide a copy of the personal data undergoing processing’.
In its report, the Centre for Information Policy Leadership (CIPL) [11] also points to the difficulty of meeting the transparency and information requirements with respect to decisions made by complex algorithms, whose outcomes often cannot be anticipated. In addition, the report states that, in technically complex scenarios where surprising correlations are found, such as when artificial intelligence is applied, it becomes difficult to know in advance what qualifies as ‘necessary’ for the purpose of the processing, or even what the purposes of the processing are, as these may change as the machine learns.
It is not clear whether this Article establishes a right to object or a prohibition on the controller. On the one hand, its wording frames it as a right, and it is located in Chapter 3 GDPR, entitled ‘Rights of the data subject’, which would mean that the provision applies only if actively invoked by the data subject. On the other hand, the Art. 29 Working Party interpreted it as a general prohibition. This discussion, however, is beyond the scope of this paper.
Veale and Edwards [47].
Recital 71 mentions a right to obtain an explanation of the decision reached, which is not included in Art. 22. This discrepancy has triggered numerous debates about whether a right to an explanation exists. An exhaustive discussion can be found in Wachter, Mittelstadt and Floridi [49]; Malgieri and Comandé [36]; and Goodman and Flaxman [24].
Art. 29 Working Party [1].
Information Commissioner’s Office [28].
For a critical analysis of the ICO’s report on fairness and the GDPR, see Butterworth [5].
Art. 29 Working Party [1].
Kalimo and Majcher [31].
Case C-457/10, AstraZeneca [6].
Privacy & Information Security Law Blog [42].
Graef, Clifford and Valcke [25].
However, one must keep in mind that competition law prohibits the abuse of market power, not market power in itself, so an assessment of whether the personal-data ‘price’ was fair is still needed.
The Netherlands, Autoriteit Persoonsgegevens, z2013-00194 [45].
References
Article 29 Working Party: Guidelines on automated individual decision-making and profiling for the purposes of Regulation 2016/679 (WP 251 rev. 01) (2018). Available at: http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053
Article 29 Working Party: Guidelines on consent under Regulation 2016/679 (WP 259 rev. 01) (2018). Available at: http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=623051
Article 29 Working Party: Opinion on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46 (WP 217) (2014). Available at: http://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp217_en.pdf
Barocas, S., Nissenbaum, H.: Big data’s end run around anonymity and consent. In: Lane, J., Stodden, V., Bender, S., Nissenbaum, H. (eds.) Privacy, Big Data and the Public Good. Cambridge University Press, Cambridge (2014)
Butterworth, M.: The ICO and artificial intelligence. The role of fairness in the GDPR framework. Comput. Law Secur. Rev. (2018). Available at: https://reader.elsevier.com/reader/sd/F4A2552841043362FF7A3BB9555F86DC14B1A531E9EDFDA585EEE202EA038D31B5878BDFC85756C5661695A4956A63BD
C-457/10, AstraZeneca, ECLI:EU:C:2012:770
Case C-13/16 Rigas, ECLI:EU:C:2017:336
Case C-212/13 Ryneš, ECLI:EU:C:2014:2428
Case C-398/15 Manni, ECLI:EU:C:2017:197
Cases C-468/10 and 469/10 ASNEF and FECEMD, ECLI:EU:C:2011:777
Centre for Information Policy Leadership: Delivering Sustainable AI Accountability in Practice. First Report: Artificial Intelligence and Data Protection in Tension (2018)
Centre for Information Policy Leadership: The ePrivacy Regulation and the EU. Charter of Fundamental Rights (2018)
Charter of Fundamental Rights of the European Union [2000] OJ C 364/01
Convention for the Protection of Human Rights and Fundamental Freedoms as amended by Protocols No. 11 and No. 14, Rome, 4 November 1950
Council of Europe: The protection of individuals with regard to automatic processing of personal data in the context of profiling. Recommendation CM/Rec (2010) 13 and explanatory memorandum (2010). Available at: https://rm.coe.int/16807096c3
Culnan, M.J., Bruening, P.: Privacy notices limitations, challenges, and opportunities. In: Selinger, E., Polonetsky, J., Tene, O. (eds.) The Cambridge Handbook of Consumer Privacy (2018)
Data Protection Network: Guidance on the use of Legitimate Interests under the EU General Data Protection Regulation (version 2.0) (2018). Available at: https://www.fairtrade.org.uk/~/media/FairtradeUK/Resources%20Library/Data%20Protection%20Network%20-%20Guidance%20on%20the%20use%20of%20legitimate%20interest.pdf
Diakopoulos, N.: Algorithmic-accountability: the investigation of Black Boxes. Tow Center for Digital Journalism (2014)
Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995] OJ L 281/31
European Data Protection Supervisor (EDPS): Developing a toolkit for assessing the necessity of measures that interfere with fundamental rights (2016). Available at: https://edps.europa.eu/sites/edp/files/publication/16-06-16_necessity_paper_for_consultation_en.pdf
Ferretti, F.: Data protection and the legitimate interest of data controllers: much ado about nothing or the winter of rights? Common Mark. Law Rev. 51(3), 843 (2014)
Gil González, E.: Big data y datos personales: ¿es el consentimiento la mejor manera de proteger nuestros datos? Diario La Ley, ISSN 1989-6913, No. 9050 (2017)
Gil González, E.: Aproximación al estudio de las decisiones automatizadas en el seno del Reglamento General Europeo de Protección de Datos a la luz de las tecnologías big data y de aprendizaje computacional. Revista Internacional de Transparencia 5 (2017)
Goodman, B., Flaxman, S.: European Union regulations on algorithmic decision-making and a “right to explanation”. AI Mag. 38(3) (2017)
Graef, I., Clifford, D., Valcke, P.: Fairness and enforcement: bridging competition, data protection and consumer law. Forthcoming in International Data Privacy Law (2018). Available at: https://www.researchgate.net/publication/326668711_Fairness_and_Enforcement_Bridging_Competition_Data_Protection_and_Consumer_Law_forthcoming_in_International_Data_Privacy_Law_2018
Hildebrandt, M.: Privacy as protection of the incomputable self: from agnostic to agonistic machine learning. DRAFT PAPER for the International Conference ‘The Problem of Theorizing Privacy’, submitted for publication in the special issue of Theoretical Inquiries in Law (TIL), and defended at PLSC-Europe (2018). Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3081776
Hoffman, D.A., Rimo, P.A.: It takes data to protect data. In: Selinger, E., Polonetsky, J., Tene, O. (eds.) The Cambridge Handbook of Consumer Privacy (2018)
Information Commissioner’s Office: Big data, artificial intelligence, machine learning and data protection (version 2.2) (2017). Available at: https://ico.org.uk/media/for-organisations/documents/2013559/big-data-ai-ml-and-data-protection.pdf
Information Commissioner’s Office: Guide to the General Data Protection Regulation (2018). Available at: https://ico.org.uk/media/for-organisations/guide-to-the-general-data-protection-regulation-gdpr-1-0.pdf
Information Commissioner’s Office: Lawful basis for processing, consent (2018). Available at: https://ico.org.uk/media/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/consent-1-0.pdf
Kalimo, H., Majcher, K.: The concept of fairness: linking EU competition and data protection law in the digital marketplace. Eur. Law Rev. 42(2), 210 (2017). Available at: https://www.researchgate.net/publication/317217197_The_concept_of_fairness_Linking_EU_competition_and_data_protection_law_in_the_digital_marketplace
Kamara, I., de Hert, P.: Understanding the balancing act behind the legitimate interest of the controller ground. A pragmatic approach. In: Selinger, E., Polonetsky, J., Tene, O. (eds.) The Cambridge Handbook of Consumer Privacy (2018)
Kosta, E.: Consent in European Data Protection Law. Martinus Nijhoff Publishers, Leiden (2013)
Kuner, C., Svantesson, D., Cate, F.H., Lynskey, O., Millard, C.: Machine learning with personal data: is data protection law smart enough to meet the challenge? Int. Data Priv. Law 7(1), 1 (2017)
Lammerant, H., de Hert, P.: Predictive profiling and its legal limits: effectiveness gone forever? In: van der Sloot, B., Broeders, D., Schrijvers, E. (eds.) Exploring the Boundaries of Big Data. Amsterdam University Press, The Hague/Amsterdam (2016)
Malgieri, G., Comandé, G.: Why a right to legibility of automated decision-making exists in the general data protection regulation. Int. Data Priv. Law 7(4), 243 (2017)
Martin, K.: Privacy notices as tabula rasa: an empirical investigation into how complying with a privacy notice is related to meeting privacy expectations online. J. Public Policy Mark. 34(2), 210 (2015)
Moerel, L., Prins, C.: Privacy for the homo digitalis: proposal for a new regulatory framework for data protection in the light of Big Data and the internet of things. SSRN Electronic Journal (2016)
National Non-Discrimination and Equality Tribunal of Finland, Case C-216/2017. Available at: https://www.yvtltk.fi/en/index/opinionsanddecisions/decisions.html
Nymity and Future for Privacy Forum: Processing personal data on the basis of legitimate interests under the GDPR: practical cases (2018). Available at: https://info.nymity.com/hubfs/Landing%20Pages/Nymity%20FPF%20-%20Legitimate%20Interests%20Report/Deciphering_Legitimate_Interests_Under_the_GDPR.pdf?hsCtaTracking=9cf491f2-3ced-4f9c-9ffa-5d73a77a773e%7C7469b2ec-e91c-4887-b5db-68d407654e23
Pasquale, F.: The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press, Cambridge, London (2015)
Privacy & Information Security Law Blog: UK ICO Issues Warning to Washington Post Over Cookie Consent Practices, November 21 (2018)
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L 119/1
Reidenberg, J., Russell, N., Callen, A., Qasir, S., Norton, T.: Privacy harms and the effectiveness of the notice and choice framework. 2014 TPRC Conference Paper, Fordham Law Legal Studies Research Paper No. 2418247 (2014). http://papers.ssrn.com/sol3/Papers.cfm?abstract_id=2418247
The Netherlands, Autoriteit Persoonsgegevens, z2013-00194, Google. English (non-official) translation available at: https://autoriteitpersoonsgegevens.nl/sites/default/files/downloads/mijn_privacy/en_rap_2013-google-privacypolicy.pdf
Van der Sloot, B., Borgesius, F.: The EU General Data Protection Regulation: a new global standard for information privacy (Working draft)
Veale, M., Edwards, L.: Clarity, surprises, and further questions in the Article 29 working party draft guidance on automated decision-making and profiling. Comput. Law Secur. Rev. 34(2), 398 (2018)
Vedder, A.: KDD: the challenge to individualism. Ethics Inf. Technol. 1(4), 275 (1999)
Wachter, S., Mittelstadt, B., Floridi, L.: Why a right to explanation of automated decision-making does not exist in the General Data Protection Regulation. Int. Data Priv. Law 7(2), 76 (2017)
World Economic Forum and The Boston Consulting Group: Rethinking Personal Data: Strengthening Trust (2012)
E.G. González is visiting researcher at Instituut voor Informatierecht (IViR), Universiteit van Amsterdam, funded by the 2018 CEINDO CEU-Banco Santander grant.
Gil González, E., de Hert, P. Understanding the legal provisions that allow processing and profiling of personal data—an analysis of GDPR provisions and principles. ERA Forum 19, 597–621 (2019). https://doi.org/10.1007/s12027-018-0546-z