The Right Not to be Subject to Automated Decisions Based on Profiling

Abstract

In this chapter, a critical analysis is undertaken of the provisions of Art. 22 of the European Union’s General Data Protection Regulation of 2016, with lines of comparison drawn to their predecessor, Art. 15 of the 1995 Data Protection Directive. Article 22 places limits on the making of fully automated decisions based on profiling when the decisions produce legal effects or similarly significant consequences for the persons subject to them. The basic argument advanced in the chapter is that Art. 22 on its face provides persons with stronger protections against such decision making than Art. 15 of the Directive does. However, doubts are raised as to whether Art. 22 will have a significant practical impact on automated profiling.

Work on this chapter was carried out partly under the aegis of the research project ‘Security in Internet Governance and Networks: Analysing the Law’ (SIGNAL), funded by the Norwegian Research Council and Norid AS. References to legal instruments are to their amended form as of 1 May 2017. Thanks are due to Luca Tosoni for useful input, particularly regarding Italian law. Nonetheless, the usual disclaimer applies.

Notes

  1.

    See further, e.g., Hildebrandt (2008), p. 19.

  2.

    This refers to the customisation of online advertisements to a person based on his or her online profile. See further Leon et al. (2012); Borgesius (2015), ch. 2.

  3.

    E-recruiting refers to the automated ranking of job applicants, which in turn can be used to select automatically persons for job interviews or to reject automatically other applicants. See further Faliagka et al. (2012), p. 557.

  4.

    Weblining refers to a situation in which a person visiting a website is offered products or services at a higher price than other (assumedly more valued) consumers have to pay, or the person is denied an opportunity of purchasing products/services that are made available to others, based on the data gleaned from the person’s online activities. See further Stepanek (2000), Andrews (2011).

  5.

    Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L281/31. The provisions of Art. 15 are set out in the next section.

  6.

    Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC, OJ L119/1.

  7.

    Loi no. 78-17 du 6 janvier 1978 relative à l’informatique, aux fichiers et aux libertés. Article 2 of the Act in its original form stipulated: ‘Aucune décision de justice impliquant une appréciation sur un comportement humain ne peut avoir pour fondement un traitement automatisé d’informations donnant une définition du profil ou de la personnalité de l’intéressé. Aucune décision administrative ou privée impliquant une appréciation sur un comportement humain ne peut avoir pour seul fondement un traitement automatisé d’informations donnant une définition du profil ou de la personnalité de l’intéressé’. Article 3 stated: ‘Toute personne a le droit de connaître et de contester les informations et les raisonnements utilisés dans les traitements automatisés dont les résultats lui sont opposés’. In amendments to the Act in 2004, the provisions of Art. 2 were moved to Art. 10 while the provisions of Art. 3 were moved to Art. 39(1). Both sets of provisions were also reformulated to align better with the DPD.

  8.

    In practice, though, the distinction between decision and data processing is blurred as decisions inevitably involve the processing of data.

  9.

    For instance, Italy’s implementation of Art. 15 has prohibited judicial or administrative decisions involving assessment of a person’s conduct that are based solely on the automated processing of personal data aimed at defining the person’s profile or personality, whereas similar decisions made by private sector actors have been simply subject to a qualified right to object by the data subject: see Art. 14 of the Personal Data Protection Code of 2003 (Decreto legislativo 30 giugno 2003, n. 196: Codice in Materia di Protezione dei Dati Personali).

  10.

    Bygrave (2002), p. 2. Further on traditional ‘fair information practice’ principles, see, e.g., Bygrave (2014), ch. 5 and references cited therein.

  11.

    For the seminal analysis of these provisions in light of Art. 15, see Bygrave (2002), pp. 334–357.

  12.

    In 2014, the German Federal Court of Justice (Bundesgerichtshof) handed down an appeal judgment that touches briefly on the scope of the German rules that transpose DPD Art. 15. See further n. 36 below.

  13.

    The view of the Working Party on the Protection of Individuals with regard to the Processing of Personal Data, established pursuant to DPD Art. 29, has been that the principle established by Art. 15 does not qualify as a ‘basic’ principle but as an ‘additional principle to be applied to specific types of processing’, at least for the purposes of assessing the adequacy of protection in third countries under DPD Art. 25: see Working Party on the Protection of Individuals with regard to the Processing of Personal Data (1998), pp. 6–7. Nonetheless, Art. 15 is sometimes taken into consideration in the context of approving Binding Corporate Rules (BCRs) for cross-border data transfer: see, e.g., approval by the Spanish Data Protection Authority (Agencia Española de Protección de Datos) of the BCRs for Latham & Watkins (file number TI/00030/2017).

  14.

    See Commission Decision 2000/520/EC of 26 July 2000 on the adequacy of the protection provided by the safe harbour privacy principles and related frequently asked questions issued by the US Department of Commerce, OJ L 215/7. The decision was invalidated by the CJEU in Maximillian Schrems v Data Protection Commissioner, Case C-362/14, Judgment of 6 October 2015.

  15.

    Commission Implementing Decision (EU) 2016/1250 of 12 July 2016 on the adequacy of the protection provided by the EU-U.S. Privacy Shield, OJ L 207/1.

  16.

    See generally Bosco et al. (2015), pp. 39, 42.

  17.

    See Draft modernised Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data [ETS 108], drawn up by the Council of Europe’s Ad hoc Committee on Data Protection (version of September 2016). According to Art. 8(1) of the draft, ‘[e]very individual shall have a right: (a) not to be subject to a decision significantly affecting him or her based solely on an automated processing of data without having his or her views taken into consideration; […] (d) to object at any time, on grounds relating to his or her situation, to the processing of personal data concerning him or her unless the controller demonstrates compelling legitimate grounds for the processing which override his or her interests or rights and fundamental freedoms’. See too the Council of Europe’s Guidelines on the Protection of Individuals with Regard to the Processing of Personal Data in the World of Big Data (adopted 23 January 2017; T-PD(2017)01), especially principles 7.1 (‘The use of Big Data should preserve the autonomy of human intervention in the decision-making process’), 7.3 (‘Where decisions based on Big Data might affect individual rights significantly or produce legal effects, a human decision-maker should, upon request of the data subject, provide her or him with the reasoning underlying the processing, including the consequences for the data subject of this reasoning’) and 7.4 (‘On the basis of reasonable arguments, the human decision-maker should be allowed the freedom not to rely on the result of the recommendations provided using Big Data’).

  18.

    See particularly principles 5.5 (‘Decisions concerning a worker should not be based solely on the automated processing of that worker’s personal data’), 5.6 (‘Personal data collected by electronic monitoring should not be the only factors in evaluating worker performance’), 6.10 (‘Polygraphs, truth-verification equipment or any other similar testing procedure should not be used’) and 6.11 (‘Personality tests or similar testing procedures should be consistent with the provisions of this code, provided that the worker may object to the processing’).

  19.

    Its replication outside Europe has occurred principally in a handful of African jurisdictions: see Senegal’s Data Protection Act of 2008 s. 48; Angola’s Law No. 22/11 on Data Protection of 2011 Art. 29; Lesotho’s Data Protection Act of 2012 s. 51; and South Africa’s Protection of Personal Information Act of 2013 s. 71. By contrast, the only jurisdiction in the Asia-Pacific region with an equivalent to DPD Art. 15 is the Macau Special Administrative Region: see its Act 8/2005 on Personal Data Protection Art. 13.

  20.

    Bygrave (2001), p. 21; Bygrave (2002), p. 364.

  21.

    See also Bygrave (2001), p. 21; Bygrave (2002), p. 357.

  22.

    Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA, OJ L119/89. Article 11 stipulates: ‘1. Member States shall provide for a decision based solely on automated processing, including profiling, which produces an adverse legal effect concerning the data subject or significantly affects him or her, to be prohibited unless authorised by Union or Member State law to which the controller is subject and which provides appropriate safeguards for the rights and freedoms of the data subject, at least the right to obtain human intervention on the part of the controller. 2. Decisions referred to in paragraph 1 of this Article shall not be based on special categories of personal data referred to in Article 10, unless suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are in place. 3. Profiling that results in discrimination against natural persons on the basis of special categories of personal data referred to in Article 10 shall be prohibited, in accordance with Union law.’ Article 11(1) essentially replicates Art. 7 of the predecessor to this Directive—Council Framework Decision 2008/977/JHA of 27 November 2008 on the protection of personal data processed in the framework of police and judicial cooperation in criminal matters, OJ L 350/60. Article 7 of the Framework Decision is expressed as a permission rather than prohibition (i.e., automated decisions ‘shall be permitted only if authorised by a law which also provides measures to safeguard the data subject’s legitimate interests’), but the effect is the same as for Art. 11(1). The second and third paragraphs of Art. 11 are new—i.e., there are no equivalent provisions in the Framework Decision.

  23.

    Directive (EU) 2016/681 of the European Parliament and of the Council of 27 April 2016 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime, OJ L119/132. Article 6(5) provides: ‘Member States shall ensure that any positive match [leading to the identification of persons who may be involved in terrorism or serious crime and who thus need to be subject to further examination by the competent authorities] resulting from the automated processing of PNR data … is individually reviewed by non-automated means to verify whether the competent authority … needs to take action under national law’.

  24.

    Explanatory text for Proposal for a Council Directive concerning the protection of individuals in relation to the processing of personal data, COM(90) 314 final—SYN 287, p.29. The forerunner to Art. 15(1) granted a person the right ‘not to be subject to an administrative or private decision involving an assessment of his conduct which has as its sole basis the automatic processing of personal data defining his profile or personality’ (Art. 14(1) of the 1990 Proposal). These provisions were then changed in the 1992 Amended Proposal for a Council Directive on the protection of individuals with regard to the processing of personal data and on the free movement of such data (COM(92) 422 final—SYN 287) such that a person was granted a right ‘not to be subjected to an administrative or private decision adversely affecting him which is based solely on automatic processing defining a personality profile’ (Art. 16(1)).

  25.

    COM(92) 422 final—SYN 287, p. 26.

  26.

    See also Bygrave (2001), p. 18; Bygrave and Berg (1995), p. 32. Cf. recital 2 in the preamble to the Directive (‘Whereas data-processing systems are designed to serve man; whereas they must, whatever the nationality or residence of natural persons, respect their fundamental rights and freedoms … and contribute to … the well-being of individuals’).

  27.

    They are reflected in the preamble to the GDPR. In particular, Recital 71 evidences concern about the potentially poor quality of fully automated decision-making processes, with emphasis put on the need ‘to ensure, in particular, that factors which might result in inaccuracies in personal data are corrected and the risk of errors is minimised’. Recital 71 also stresses the need ‘to secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons on the basis of’ the categories set out in Art. 9(1).

  28.

    European Parliament (2013), p. 93 (outlining proposed Art. 20(1)). However, two other committees in the Parliament, namely the Committee on Internal Market and Consumer Protection (IMCO) and the Committee on Industry, Research and Energy (ITRE), were friendlier to profiling than the Committee on Civil Liberties, Justice and Home Affairs (which had the lead role in negotiating the GDPR). See further European Parliament (2013), pp. 308–309, 471–472.

  29.

    Children are also singled out in respect of the right to data erasure. Recital 65 states that such a right is ‘relevant in particular where the data subject has given his or her consent as a child and is not fully aware of the risks involved by the processing, and later wants to remove such personal data, especially on the Internet’.

  30.

    Recitals are not legally binding. Thus, they do not create rights or obligations that are contrary to, or not inherent in, the Articles: see, e.g., CJEU, Giuseppe Manfredi v. Regione Puglia, Case C-308/97, Judgment of 25 November 1998, paras. 29–30; CJEU, Criminal Proceedings against Nilsson, Hagelgren & Arrborn, Case C-162/97, Judgment of 19 November 1998, para. 54.

  31.

    Cf. Art. 11(1) of Directive (EU) 2016/680 – supra n. 22 – which operates expressly as a prohibition.

  32.

    The case, for instance, in Belgium: see Art. 12bis of the 1998 Belgian data protection legislation (Wet tot omzetting van de Richtlijn 95/46/EG van 24 oktober 1995 van het Europees Parlement en de Raad betreffende de Bescherming van Natuurlijke Personen in verband met de Verwerking van Persoonsgegevens en betreffende het Vrij Verkeer van die Gegevens, van 11 december 1998; Loi transposant la Directive 95/46/CE du 24 octobre 1995 du Parlement Européen et du Conseil relative à la Protection des Personnes Physiques à l’égard du Traitement de Données à Caractère Personnel et à la Libre Circulation de ces Données, du 11 décembre 1998).

  33.

    As has been done, e.g., in Norway: see s. 25 of the Personal Data Act of 2000 (lov om behandling av personopplysninger av 14. april 2000 nr. 31).

  34.

    The case with Italy: see Art. 14 of the Personal Data Protection Code of 2003.

  35.

    Further on this aim, see, e.g., Recitals 1, 6, 11 and 71 in the preamble to the GDPR.

  36.

    See also COM(92) 422 final—SYN 287, p. 26: ‘what is prohibited is the strict application by the user [data controller] of the results produced by the system. Data processing may provide an aid to decision-making, but it cannot be the end of the matter; human judgement must have its place. It would be contrary to this principle, for example, for an employer to reject an application from a job-seeker on the sole basis of his results in a computerized psychological evaluation, or to use such assessment software to produce lists giving marks and classing job applicants in order of preference on the sole basis of a test of personality’. See too the judgment of the German Federal Court of Justice in the so-called SCHUFA case concerning the use of automated credit-scoring systems: judgment of 28 January 2014, VI ZR 156/13. Here, the court held, on appeal, that the credit-scoring system fell outside the ambit of the German rules that transpose DPD Art. 15 (the relevant provisions are found in §6a of Germany’s Federal Data Protection Act 1990 (Bundesdatenschutzgesetz – Gesetz zur Fortentwicklung der Datenverarbeitung und des Datenschutzes vom 20. Dezember 1990), as amended) because the automated elements of the decisional process pertained to the preparation of evidence; the actual decision to provide credit was made by a person. In the words of the court: ‘Von einer automatisierten Einzelentscheidung kann im Falle des Scorings nur dann ausgegangen werden, wenn die für die Entscheidung verantwortliche Stelle eine rechtliche Folgen für den Betroffenen nach sich ziehende oder ihn erheblich beeinträchtigende Entscheidung ausschließlich aufgrund eines Score-Ergebnisses ohne weitere inhaltliche Prüfung trifft, nicht aber, wenn die mittels automatisierter Datenverarbeitung gewonnenen Erkenntnisse lediglich Grundlage für eine von einem Menschen noch zu treffende abschließende Entscheidung sind’: para. 34.

  37.

    Cf. the European Parliament Committee on Civil Liberties, Justice and Home Affairs proposed provisions to capture profiling when it is based ‘solely or predominantly on automated processing’ (proposed Art. 20(5)): European Parliament (2013), p. 94. The omission of any reference to ‘predominantly’ in the final version of Art. 22(1) and, indeed, in Recital 71 underlines that ‘solely’ is the sole operative criterion.

  38.

    Church and Millard (2010), p. 84.

  39.

    Bygrave (2001), p. 19; see too Bygrave (2002), p. 322.

  40.

    See, e.g., Dammann and Simitis (1997), p. 220. The Commission also seems to have taken the view that simply sending a commercial brochure to a list of persons selected by computer does not significantly affect the persons concerned under Art. 15(1): COM(92) 422 final—SYN 287, pp. 26–27. This view, though, related to a draft provision expressly requiring an adverse effect—a requirement that was omitted from the wording of the final version of Art. 15(1).

  41.

    See too Vermeulen (2013), p. 12. Vermeulen bases this view on the initial version of the right proposed by the Commission in 2012, but that version had the same structure and mostly the same wording as the current Art. 22(1).

  42.

    Savin (2014), p. 4.

  43.

    Bygrave (2001), p. 21; Bygrave (2002), p. 327.

  44.

    The European Parliament Committee on Civil Liberties, Justice and Home Affairs also listed such a right in its proposed Art. 20(5) of the draft Regulation: European Parliament (2013), p. 94.

  45.

    Supra n. 30 and references cited therein.

  46.

    See especially Wachter and others (2017). Articles 13(2)(f), 14(2)(g) and 15(1)(h) all concern the supply of ‘meaningful information about the logic involved [in automated decision-making], as well as the significance and envisaged consequences of such processing for the data subject’, albeit in different contexts.

  47.

    However, the provisions of Art. 13—which concern supply of information about processing when the data have been obtained from the data subject—are best understood as only concerned with supply of information prior to an automated decision being made. This follows from the fact that the supply of information shall occur ‘at the time when personal data are obtained’ (Art. 13(2)). Article 14—which concerns supply of information where personal data have been obtained from other sources than the data subject—does not operate with the latter delimitation. Nor does Art. 15.

  48.

    The notion of ‘fair’ here builds primarily on the dual principles identified by Galligan (1996), p. 419: ‘one is that people ought to know how they will be treated by those holding power over them; the other is that people ought to be treated equally in the sense that the criteria are applied generally and consistently.’ The giving of reasons rests on both these principles. It is, at the same time, rooted in several interlinked interests, including a relatively technocratic concern about the quality of decision making (e.g., ensuring the reduction of decisional error and unwarranted bias: see further Galligan (1996), pp. 431–433) and a more dignitarian concern related to treating persons with respect. In terms of the latter, ‘[g]iving reasons expresses respect just as a refusal or failure to do so—where the failure evinces disregard for a person’s opinion of the justice of his treatment—expresses contempt’: Allan (1998), p. 500; see too Galligan (1996), p. 433.

  49.

    Ireland, however, has operated with such an exception (see s. 6B(2)(b) of Ireland’s Data Protection (Amendment) Act 2003), but it has been the only country in Europe to do so.

  50.

    This follows also from Arts. 13(2)(f), 14(2)(g) and 15(1)(h).

  51.

    See, e.g., Bygrave (2015), pp. 31–32.

  52.

    Kuner and others (2017), p. 1.

References

  • Allan TRS (1998) Procedural fairness and the duty of respect. Oxf J Leg Stud 18:497–515

  • Andrews L (2011) I know who you are and I saw what you did. Social networks and the death of privacy. Free Press, New York

  • Borgesius FJZ (2015) Improving privacy protection in the area of behavioural targeting. Kluwer Law International, Alphen aan den Rijn

  • Bosco F, D’Angelo E, Vermeersch E (2015) National data protection authorities’ views on profiling. In: Creemers N, Guagnin D, Koops BJ (eds) Profiling technologies in practice. Wolf Legal Publishers, Oisterwijk, pp 21–46

  • Bygrave LA (2001) Minding the machine: Article 15 of the EC data protection directive and automated profiling. Comput Law Secur Rev 17:17–24

  • Bygrave LA (2002) Data protection law: approaching its rationale, logic and limits. Kluwer Law International, Alphen aan den Rijn

  • Bygrave LA (2014) Data privacy law: an international perspective. Oxford University Press, Oxford

  • Bygrave LA (2015) Internet governance by contract. Oxford University Press, Oxford

  • Bygrave LA, Berg JP (1995) Reflections on the rationale for data protection laws. In: Bing J, Torvund O (eds) 25 years anniversary anthology in computers and law. Tano, Oslo, pp 3–39

  • Church P, Millard C (2010) Comments on the data protection directive. In: Büllesbach A, Gijrath S, Poullet Y, Prins C (eds) Concise European IT law, 2nd edn. Kluwer Law International, Alphen aan den Rijn, pp 83–85

  • Dammann U, Simitis S (1997) EG-Datenschutzrichtlinie: Kommentar. Nomos, Baden-Baden

  • European Parliament (2013) Report on the proposal for a regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) (COM(2012)0011 – C7-0025/2012 – 2012/0011(COD))(A7-0402/2013; PE501.927v05-00)

  • Faliagka E, Tsakalidis A, Tzimas G (2012) An integrated e-recruitment system for automated personality mining and applicant ranking. Internet Res 22:551–568

  • Galligan DJ (1996) Due process and fair procedures. A study of administrative procedures. Clarendon Press, Oxford

  • Hildebrandt M (2008) Defining profiling: a new type of knowledge? In: Hildebrandt M, Gutwirth S (eds) Profiling the European citizen. Springer, Dordrecht, pp 17–47

  • Kuner C, Svantesson DJB, Cate FH, Lynskey O, Millard C (2017) Machine learning with personal data: is data protection law smart enough to meet the challenge? Int Data Priv Law 7:1–2

  • Leon PG, Cranshaw J, Cranor LF, Graves J (2012) What do online behavioral advertising privacy disclosures communicate to users? In: Proceedings of the 2012 ACM workshop on privacy in the electronic society. Association for Computing Machinery (ACM), New York, pp 19–30

  • Savin A (2014) Profiling and automated decision making in the present and new EU data protection frameworks. Copenhagen business school open archive. http://openarchive.cbs.dk/handle/10398/8914. Accessed 1 May 2017

  • Stepanek M (2000) Weblining. Business Week 3 April: pp EB26–EB34

  • Vermeulen M (2013) Regulating profiling in the general data protection regulation: an interim insight into the drafting of Article 20. 1 September 2013, EMSOC project (User empowerment in a social media culture), Brussels, http://emsoc.be/wp-content/uploads/2013/11/D3.2.2-Vermeulen-Emsoc-deliverable-profiling-Formatted1.pdf. Accessed 1 May 2017

  • Wachter S, Mittelstadt B, Floridi L (2017) Why a right to explanation of automated decision-making does not exist in the general data protection regulation. Int Data Priv Law 7:76–99

  • Working Party on the Protection of Individuals with Regard to the Processing of Personal Data (1998) Transfers of personal data to third countries: Applying Articles 25 and 26 of the EU data protection directive. Working Document adopted 24 July, 1998, http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/1998/wp12_en.pdf. Accessed 1 May 2017

Author information

Correspondence to Lee A. Bygrave.

Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Mendoza, I., Bygrave, L.A. (2017). The Right Not to be Subject to Automated Decisions Based on Profiling. In: Synodinou, TE., Jougleux, P., Markou, C., Prastitou, T. (eds) EU Internet Law. Springer, Cham. https://doi.org/10.1007/978-3-319-64955-9_4

  • DOI: https://doi.org/10.1007/978-3-319-64955-9_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-64954-2

  • Online ISBN: 978-3-319-64955-9

  • eBook Packages: Law and Criminology (R0)
