
Discrimination in the Digital Market: Protection from Different Sides

The Risk of Discrimination in the Digital Market

Part of the book series: SpringerBriefs in Law ((BRIEFSLAW))


Abstract

This chapter shows that the use of algorithms by digital service providers gives rise to insidious new forms of discrimination against which European anti-discrimination legislation alone cannot protect. The limitations of this legislation can be traced back to two factors. First, the European legislation is too closely linked to closed lists of typical grounds of discrimination. Second, it shows gaps regarding indirect discrimination, as it does not provide adequate protection against proxies and hidden discrimination: discrimination is harder to identify when it is based on characteristics that are formally neutral but recur frequently within protected categories. Given this scenario, effective protection against online discrimination cannot be entrusted to sector-specific rules alone, but requires a systematic interpretation that is also attentive to the rules on unfair commercial practices and data protection.


Notes

  1.

    “Empirical evidence of bias in automated decision-making will require us to revisit many of the fundamental challenges to the conceptual apparatus of discrimination law—from academic debates about the discipline’s normative foundations to courts’ reluctance to approach causation through a ‘legalistic’ lens in the equality context.” In these terms Adams-Prassl et al. (2023), p. 145. For an overview of European anti-discrimination legislation, see Wójcik (2022), p. 95. The author points out that “according to article 2 of the Treaty on the European Union, equality is one of the founding values of the EU. Pursuant to article 3 of the treaty, equality, non-discrimination, and social justice also remain the EU’s objectives. Furthermore, in Mangold, the European Court of Justice confirmed that non-discrimination constitutes a general principle of EU law”. On the fact that the existing categories of EU anti-discrimination law do not provide an easy fit for algorithmic decision making, see Hacker (2018), p. 1143; Wachter et al. (2021), p. 11.

  2.

    Gerards and Xenidis (2021), p. 11, state in their report that “at the legal level, such measures notably include adopting the draft Horizontal Directive under negotiation at the Council since 2008 to equalise the scope of EU non-discrimination law, addressing the gaps linked to the exceptions in the material scope of the Gender Goods and Services Directive in relation to the media, advertising and education, and bringing clarity on the prohibition of intersectional discrimination. Turning to the role of the Court of Justice, an expansive interpretation of the personal scope of EU equality law—both in terms of the scope of single protected grounds and the exhaustive nature of the list established in Article 19 TFEU considering the open-ended provision in Article 21 of the Charter—would enhance the law’s capacity to address algorithmic discrimination”.

  3.

    As to the debate on whether a discriminatory intent is necessary to apply the anti-discriminatory legislation, see Mattarella (2020), p. 696. With reference to the thesis that discriminatory intent may be lacking, see Morozzo Della Rocca (2002), p. 123; Morozzo Della Rocca (2007), p. 289; Sitzia (2011), p. 99. From a different perspective see Maffeis (2015), p. 171; Navarretta (2020), p. 25. Concerning proxies, there are no unambiguous indications from the Court of Justice of the European Union. See on this point Case C-177/88, Elisabeth Johanna Pacifica Dekker v Stichting Vormingscentrum voor Jong Volwassenen (VJV-Centrum) Plus and Case C-668/15, Jyske Finans A/S v Ligebehandlingsnævnet, acting on behalf of Ismar Huskic. In the latter case, the Court of Justice of the European Union ruled that unequal treatment based on the plaintiff’s country of origin does not constitute discrimination based on ethnic origin, while in the Dekker case it held that discrimination based on pregnancy is a form of sex discrimination.

  4.

    Gerards and Xenidis (2021), p. 8.

  5.

    See Zuiderveen Borgesius (2019), p. 10; Friedman and Nissenbaum (1996); Timmis et al. (2015), p. 1, emphasise that the risk of social exclusion needs to be part of the agenda for any implementation of technology enhanced assessment.

  6.

    Compare Zuiderveen Borgesius (2023), p. 11.

  7.

    Zuiderveen Borgesius (2019), p. 13. The author shows that if a bank uses an artificial intelligence system to predict which loan applicants will have problems repaying, and the system is trained on data that does not contain information on protected characteristics such as skin colour, this does not rule out that the training data may nonetheless be discriminatory.

  8.

    Prince and Schwarcz (2020). The authors state that proxy discrimination is a particularly pernicious subset of disparate impact. Like all forms of disparate impact, it involves a facially neutral practice that disproportionately harms members of a protected class.

  9.

    See the Report of the Defender of Rights in partnership with the CNIL, Algorithms: preventing automated discrimination. https://www.defenseurdesdroits.fr, p. 4.

  10.

    See Schönberger (2019), p. 184. The author points out that, especially in the health sector, where indirect discrimination is also very common, it is not difficult to pass the proportionality test.

  11.

    Femia (1996), p. 456. Picker (2003), p. 701, believes that liberal social and economic systems are congenitally discriminatory.

  12.

    See Zuiderveen Borgesius (2020), p. 3. Compare Perlingieri (2020b), p. 451. Hacker (2018), p. 1143, says that empirical evidence is mounting that artificial intelligence applications threaten to discriminate against legally protected groups. Given this evidence, the author shows how the concepts of anti-discrimination law may be combined with algorithmic audits and data protection impact assessments to unlock the algorithmic black box. On the need for multi-level anti-discrimination protection, see Capuzzo (2020), p. 89.

  13.

    See Franzoni (2021), p. 6; Messina (2022), p. 196; Calzolaio (2017), p. 598.

  14.

    Gilman (2020), p. 1 shows that algorithms analyze data, sort people into categories, and serve as gatekeepers to life’s necessities. Yet people remain largely in the dark about these systems, creating an informational asymmetry whose harmful consequences fall most harshly on low-income people.

  15.

    See COM (2020) 264 final, Communication from the Commission to the European Parliament and the Council, Data protection as a pillar of citizens’ empowerment and the EU’s approach to the digital transition—two years of application of the General Data Protection Regulation. https://eur-lex.europa.eu. Regarding the circumstance that GDPR rules have the advantage of being based on ex ante and not only ex post protection mechanisms, taking a risk-based approach to prevent the misuse of personal data, compare Li (2022) p. 4; Contissa et al. (2018); Mantelero (2018), p. 299.

  16.

    Hoeren and Niehoff (2022); De Franceschi and Schulze (2019), p. 22. The authors emphasise that the European Union started bringing consumer protection in line with the digital age by acknowledging data as counter-performance for the purposes of the new Digital Content Directive. See also Mendola (2023), p. 423; Schmidt-Kessel (2023), p. 153; Gardiner (2022).

  17.

    On data as fragments of the lives of the persons to whom they refer, see Tomassini (2020). These fragments have value not because of the intent of the person who left them, often unconsciously, but because they are collected and processed according to a certain procedure by a machine learning algorithm. The value of this data derives from the work of the person who collects and processes it, thereby obtaining a useful result that can be exploited on the network. See Franzoni (2021), p. 6.

  18.

    https://eur-lex.europa.eu. Regarding profiling, reference must be made to Article 4 of the GDPR, which states that profiling means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements. With reference to the relationship between profiling and technological evolution, compare: Perlingieri (2020a), p. 177; Perlingieri (2018), p. 481; Garofalo (2021), p. 1505; Pellecchia (2018), p. 1209; D’Ippolito (2021), p. 87.

  19.

    See Mendoza and Bygrave (2017), p. 77, where, in particular, a critical analysis is made of Article 22 of the GDPR in comparison with Article 15 of Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data. According to the authors, Article 22 of the GDPR, while offering individuals stronger protection than Article 15 of the 1995 directive, is unlikely to have a significant practical impact on automated profiling.

  20.

    The prohibition of decisions based solely on automated processing is not absolute: exceptions are specified in paragraph 2 of Article 22, namely where such a decision: is necessary for entering into, or performance of, a contract between the data subject and a data controller; is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or is based on the data subject’s explicit consent. Compare Grasso (2023), p. 29.

  21.

    Convention 108, Convention for the protection of individuals regarding the processing of personal data. Council of Europe. https://www.europarl.europa.eu. Convention 108 constitutes the first binding international instrument which is aimed at protecting individuals against abuses which may accompany the collection and processing of personal data, and which seeks to regulate, at the same time, the cross-border flow of personal data.

  22.

    See Veale and Edwards (2018), p. 398; Pizzetti (2018).

  23.

    Skitka et al. (1999), p. 991. See the authors for an analysis of some computerised human decision support and aid systems in common contexts, in comparison with unsupported decisions.

  24.

    Zadeh et al. (2017).

  25.

    Griffin (2022), p. 1; McMillan Cottom (2020), p. 444.

  26.

    A recent empirical study showed how the targeting of online ads can reinforce stereotyping and segregation on the labour market: during the experiment, researchers used the Facebook advertising platform to neutrally disseminate various employment ads. In the end, cashier positions in supermarkets reached an audience composed of 85% women, while ads for taxi driver positions reached a 75% black audience and ads for lumberjack positions reached an audience that was 90% male and 72% white. A global consensus has emerged among both researchers and policymakers that risks of algorithmic discrimination are pervasive and multifaceted. In this context, understanding these risks and the types of legal challenges they create is key to ensuring equality and combating discrimination. On this point, please refer to Gerards and Xenidis (2021), p. 7; De Franceschi (2022), p. 73; Frosini (2020), p. 402.

  27.

    Griffin (2022), p. 1.

  28.

    The regulation appears to convey a clear message: to restore centrality to the human factor, although without the necessary logical and legal precedence of the human agent being an obstacle to technological innovation and to the improvement of the processes that public and private parties put in place. This is what has been affirmed by Pajno et al. (2019), p. 13.

  29.

    www.italgiure.giustizia.it.

  30.

    https://eur-lex.europa.eu.

  31.

    Article 10 of Directive (EU) 2016/680 states that processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be allowed only where strictly necessary, subject to appropriate safeguards for the rights and freedoms of the data subject, and only: where authorised by Union or Member State law; to protect the vital interests of the data subject or of another natural person; or where such processing relates to data which are manifestly made public by the data subject.

  32.

    https://brasil.mylex.net/legislacao.

  33.

    Freire de Carvalho (2020), p. 6.

  34.

    See Ricciuto (2019), p. 54; Ricciuto (2022); Carapezza Figlia (2022), p. 1372; Tullio (2023), p. 78.

  35.

    Irti (2021), p. 40, points out that while Article 1 of Directive 95/46/EC stated that Member States shall protect the fundamental rights and freedoms of natural persons, and in particular their right to privacy with respect to the processing of personal data, Article 1(2) of the GDPR states that the GDPR protects the fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data. The author dwells, in particular, on the implications of the different content of the rules in question.

  36.

    Compare Gentili (2022) p. 707.

  37.

    Perlingieri (2018), p. 481. The author emphasises that the innovation marks the transition from a conception based exclusively on informed consent to one characterised predominantly by control. The author also states that it is no longer possible to disguise the fact that consent is not sufficient and in fact in some respects misleading and unsuitable to guarantee respect for the person. Compare Francesca (2022), p. 563; Thobani (2015), p. 459; Mignone (2014), p. 14.

  38.

    Irti (2021), p. 42; Petit and Van Cleynenbreugel (2020), p. 4, underline that digital businesses and online platforms defy traditional notions of market power grounded in a well-defined relevant product/price. In the context of digital businesses, consumers rather pay with their own data or with their time, which may require different tools to analyse market dominance. Although often considered powerful economic actors, platforms depend on the ability of the underlying technology to maintain and attract users to their services. The compilation and use of an ever-increasing amount of personal data and big data plays an essential role in this regard.

  39.

    See Leistner (2021), p. 515; Jabłonowska (2022), p. 67, says that the European Union institutions have recently intensified their efforts to pass new legislation for the digital economy.

  40.

    The Data Act lays down harmonised rules on making data generated by the use of a product or related service available to the user of that product or service, on making data available by data holders to data recipients, and on making data available by data holders to public sector bodies or Union institutions, agencies or bodies where there is an exceptional need, for the performance of a task carried out in the public interest. The Data Act also pays attention to ensuring that clauses on the provision of data are non-discriminatory. Recital 41 in fact provides that, to compensate for the lack of information on the conditions of different contracts, which makes it difficult for the data recipient to assess whether the terms for making the data available are non-discriminatory, it should be for the data holder to demonstrate that a contractual term is not discriminatory. There is no unlawful discrimination where a data holder uses different contractual terms for making data available, or different compensation, if those differences are justified by objective reasons. On this point see Orlando (2022a, b); Sganga (2022), p. 65; Caloprisco (2021), p. 169.

  41.

    See in particular Article 26(2) of the Data Governance Act. On this point compare Alfonsi (2022), p. 294. About the European Data Strategy, see Arisi (2022), p. 33. For a comparison between the GDPR and the Data Governance Act, see Resta (2022), p. 971. The author, p. 984, argues that Article 15 of the GDPR does not always work at the operational level, either because the incentives needed to overcome the costs of inertia involved in exercising the right are lacking or because the very possibilities of supervision as to how data can be used are limited. The Data Governance Act, on the other hand, prefigures more or less advanced forms of support to individuals both in the pre-consent phase (Art. 12(1)(m)) and in the phase of exercising the data subject’s rights and the relevant remedies (Art. 2(1)(11); Art. 10(1)(b)).

  42.

    Perlingieri (2018), p. 484; Perlingieri (2018), p. 179.

  43.

    Gentili (2009), p. 207; Hernu (2020), p. 44.

  44.

    See Alpa et al. (2009), p. 1417; Schulze (2008), p. 11; De Cristofaro (2009); Tommasi (2011), p. 119.

  45.

    Principles, Definitions and Model Rules of European Private Law Draft Common Frame of Reference (DCFR), Outline Edition, 2009, p. 65 (7). See Books II and III—2:101 to II—2:105 and III—1:105.

  46.

    For a detailed reconstruction on the point, see Checchini (2019), p. 4. The problem is usually approached in terms of the prevalence of the principle of equality over the principle of contractual freedom, neglecting that these are principles of equal weight operating in different fields, since equality is pertinent to cases of distributive justice: equality in a community, equality before the law. Contract, however, is a case of retributive justice, to which the principle of freedom pertains. On these aspects compare Galgano (2002), p. 53; Buonocore (2008), p. 551; Gentili (2009), p. 221. The author states that individual relationships do not lend themselves to being subjected to a judgement of equality of treatment which, in fact, presupposes the existence of two terms to be compared and a criterion of comparison that are not found in a strictly individual relationship. For a critique of attempts to transfer aspirations of distributive justice onto the contract, see D’Amico (2019), p. 48. With specific reference to the impact of the principle of non-discrimination in contractual relations, see Piraino (2015) p. 233; Perlingieri (2017), p. 1629, says that each contract is characterised by its own “proclaimed social function”, which represents the reference parameter for appreciating its merits. Roppo (2011), p. 79, considers that to the extent that it is the realm of freedom, the contract can also be the realm of inequality and discrimination, based on the free choices of the contracting parties. For the author, private autonomy is intrinsically discriminatory and between contract and equality there is an ambiguous relationship, which suffers from the ambiguity inherent in the very idea of equality: declined, as is well known, in terms of formal equality or substantial equality. See also Roppo (2020), p. 28.

  47.

    See Resta (2019), p. 211. The author suggests distinguishing issues concerning personal data from those concerning non-personal data. For a comparison between the Artificial Intelligence Act and the EU data protection acquis see Mazzini and Scalzo (2022) p. 35. Regarding the relationship between GDPR and consumer discipline, compare Pagliantini (2022), p. 1560.

  48.

    Lanni (2020), p. 120.

  49.

    Auer (2021), p. 14, fears the danger of moving backwards from the intellectual standards of decades, if not centuries, about discrimination. Montinaro (2022), p. 352, points out that the omnibus directive refers to the GDPR and that from this it can be deduced that the unfairness of the practice is not excluded by mere compliance with the obligation of transparency of parameters. In particular, the author observes that there could be the collection of data to personalise prices without the consent of consumers; the use of data to personalise prices when such data were requested for other stated reasons; the use of knowledge of a specific vulnerable condition obtained through real-time observation of an individual consumer.

  50.

    See Li (2022), p. 4. On the fact that artificial intelligence (AI) systems are increasingly being deployed by marketing entities in connection with consumer interaction and manipulation, and on manipulation as a substantive legal measure for drawing the line between fair and unfair practices, see Galli (2022), p. 1. On how instruments in EU consumer law could alleviate certain asymmetries in power and information, see Durovic and Watson (2022), p. 153. In particular, the authors point out that consumer data has become a driving force in the digital economy. As the number of data interactions increases, so too do the insights into ever more intimate aspects of one’s daily life, behaviour and personality. Amongst the various products and services, one innovative advancement in the world of data-driven technology stands out as deserving particular attention: the capability to infer emotions from (personal) data and to use such information to respond to an individual’s needs on a highly intimate level. While the technology has considerable potential, it is controversial, not least due to the highly sensitive and private nature of emotions, but also due to its questionable reliability as well as potential adverse effects.

  51.

    https://eur-lex.europa.eu. Compare Capobianco (2018), p. 13.

  52.

    Orlando (2022a, b), p. 357. Compare Barenghi (2020), p. 195; Barba (2021), p. 104.

  53.

    Duivenvoorde (2022), p. 43. The author shows that despite the liability exemptions in the E-Commerce Directive and the Digital Services Act, the Unfair Commercial Practices Directive provides significant room to hold online marketplaces liable.

  54.

    Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019, amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules. https://eur-lex.europa.eu.

  55.

    See Recital 45.

  56.

    Benedetti (2021), p. 413, analyses the implications of no longer seeking out potential customers with contractual proposals, but offering them what we already know they will accept, by means of invasive profiling against which there is a need for new forms of protection.

  57.

    See Guffanti Pesenti (2021), p. 635.

  58.

    See Smuha et al. (2021), p. 23. The authors criticise Article 5(c) of the Proposed Artificial Intelligence Act by stating that this article focuses on social behaviour or known or predicted personal or personality characteristics, while it is well documented that proxies may be employed where such personal data are protected. These proxies could be drawn from factors not mentioned in the Article, such as geographical location (postcodes, etc.). Scoring on these measures can be as discriminatory and devastating for individuals as scoring drawn from the characteristics included in the Proposal.

References

  • Adams-Prassl J, Binns R, Kelly-Lyth A (2023) Directly discriminatory algorithms. Mod Law Rev 86:144–175


  • Alfonsi R (2022) Approvato il Data Governance Act: Regolamento (Ue) 2022/868 del 30 maggio 2022 sulla governance europea dei dati. Persona e Mercato. Osservatorio OGID 8:294–298


  • Alpa G, Iudica G, Perfetti U, Zatti P (eds) (2009) Il Draft common frame of reference del diritto privato europeo. Cedam, Padua


  • Arisi M (2022) Open knowledge. Access and re–use of research data in the European Union open data directive and the implementation in Italy. Ital Law J 8:34–73


  • Auer M (2021) Granular norms and the concept of law: a critique. In: Busch C, De Franceschi A (eds) Algorithmic regulation and personalized law. CH Beck/Hart, München - Oxford - Baden-Baden, pp 137–151


  • Barenghi A (2020) Diritto dei consumatori. Wolters Kluwer, Milan


  • Barba A (2021) Capacità del consumatore e funzionamento del mercato. Valutazione e divieto delle pratiche commerciali. Giappichelli, Turin


  • Benedetti AM (2021) Contratto, algoritmi e diritto civile transnazionale: cinque questioni e due scenari. Rivista di diritto civile 3:411–423


  • Buonocore V (2008) Principio di uguaglianza e diritto commerciale. Giurisprudenza commerciale 35:551–582


  • Caloprisco F (2021) Data Governance Act. Condivisione e “altruismo” dei dati. Quaderni Aisdue 2:169–188


  • Calzolaio S (2017) Protezione dei dati personali. In: Dig Disc Pubbl Agg. Utet Giuridica—Wolters Kluwer, Turin, pp 594–635


  • Capobianco E (2018) Le pratiche commerciali scorrette nel settore bancario. Diritto del mercato assicurativo e finanziario 3:13–29


  • Capuzzo G (2020) “Do algorithms dream about electric sheep?” Percorsi di studio in tema di discriminazione e processi decisori algoritmici tra le due sponde dell’Atlantico. Rivista di diritto dei media 2:89–106


  • Carapezza Figlia G (2022) “L’equivoco della privacy”. Circolazione dei dati personali e tutela della persona. Jus Civile 5:1372–1377


  • Checchini B (2019) Discriminazione contrattuale e dignità della persona. Giappichelli, Turin


  • Contissa G, Docter K, Lagioia F, Lippi M, Micklitz HW, Przemysla P, Sartor G, Torroni P (2018) Claudette meets Gdpr: automating the evaluation of privacy policies using artificial intelligence. European Consumer Organisation (BEUC) study report. http://hdl.handle.net/1814/60795. Accessed 10 June 2022

  • D’Amico G (2019) Giustizia contrattuale e contratti asimmetrici. Europa e diritto privato 1:1–49


  • De Cristofaro G (ed) (2009) I «Principi» del diritto comunitario dei contratti. Acquis communautaire e diritto privato europeo. Giappichelli, Turin


  • De Franceschi A (2022) Consumers’ vulnerability in the digital economy: personal data as counterperformance and digital obsolescence. Eur J Consum Law 2:73–93


  • D’Ippolito G (2021) Profilazione e pubblicità targettizzata online. Real–Time Bidding e behavioural advertising. ESI, Naples


  • Duivenvoorde B (2022) The liability of online marketplaces under the unfair commercial practices directive, the e–commerce directive and the digital services act. J Eur Consum Market Law 11:43–52


  • Durovic M, Watson J (2022) AI, consumer data protection and privacy. In: Ienca M, Pollicino O, Liguori L, Stefanini E, Andorno R (eds) The Cambridge handbook of information technology, life sciences and human rights. Cambridge University Press, Cambridge, pp 273–287


  • Femia P (1996) Interessi e conflitti culturali nell’autonomia privata e nella responsabilità civile. ESI, Naples


  • Francesca M (2022) Il grande gioco. Buon costume e buon costume stipulativo alla prova dei social network. Teoria e prassi del diritto 2:559–572


  • De Franceschi A, Schulze R (2019) New challenges and perspectives. In: De Franceschi A, Schulze R, Graziadei M, Pollicino O, Riente F, Sica S, Sirena P (eds.), Digital revolution–new challenges for law. Data protection, artificial intelligence, smart products, blockchain technology and virtual currencies, Beck/Hart, München-Baden-Baden, pp 1–15


  • Franzoni M (2021) Lesione dei diritti della persona, tutela della privacy e intelligenza artificiale. Jus Civile 1:4–21


  • Freire de Carvalho B (2020) Discriminação algorítmica e transparência na lei geral de proteção de dados pessoais. Revista de Direito e as Novas Tecnologias. https://bd.tjdft.jus.br/jspui/handle/tjdft/49512. Accessed 20 June 2023

  • Friedman B, Nissenbaum H (1996) Bias in computer systems. ACM Trans Inf Syst 14:330–347


  • Frosini TE (2020) Le sfide attuali del diritto ai dati personali. In: Perlingieri P, Giova S, Prisco I (eds) Il trattamento algoritmico dei dati tra etica, diritto ed economia. ESI, Naples, pp 395–402


  • Galgano F (2002) Il negozio giuridico. Giuffrè, Milan


  • Galli F (2022) Algorithmic marketing and EU law on unfair commercial practices. Springer, Cham


  • Gardiner C (2022) Unfair contract terms in the digital age. The challenge of protecting European consumers in the online marketplace. Edward Elgar, Cheltenham


  • Garofalo G (2021) Identità digitale e diritto all’oblio: questioni aperte all’indomani dell’approvazione del GDPR. Diritto di famiglia e delle persone 3:1506–1518


  • Gentili A (2009) Il principio di non discriminazione nei rapporti civili. Rivista critica del diritto privato 2:207–231


  • Gentili A (2022) La volontà nel contesto digitale: interessi del mercato e diritti delle persone. Rivista trimestrale di diritto e procedura civile 76:702–716


  • Gerards J, Xenidis R (2021) Algorithmic discrimination in Europe: challenges and opportunities for gender equality and non discrimination law. European Commission, Directorate-General for Justice and Consumers. Publications Office. https://data.europa.eu/doi/10.2838/544956

  • Gilman ME (2020) Poverty lawgorithms: a poverty lawyer’s guide to fighting automated decision–making harms on low–income communities. Data Soc. https://ssrn.com/abstract=3699650. Accessed 20 June 2023

  • Grasso AG (2023) GDPR Feasibility and algorithmic non-statutory discrimination. ESI, Naples


  • Griffin R (2022) Tackling discrimination in targeted advertising: US regulators take very small steps in the right direction—but where is the EU? https://verfassungsblog.de/targete-ad. Accessed 19 July 2023

  • Guffanti Pesenti L (2021) Pratiche commerciali scorrette e rimedi nuovi. La difficile trasposizione dell’art. 3, co. 1, N. 5), dir. 2019/2161/UE. Europa e diritto privato 4:635–683


  • Hacker P (2018) Teaching fairness to artificial intelligence: existing and novel strategies against algorithmic discrimination under EU law. Common Mark Law Rev 55:1143–1185


  • Hernu R (2020) Le principe d’égalité et le principe de non-discrimination dans la jurisprudence de la CJUE. https://www.conseil-constitutionnel.fr. Accessed 19 July 2023

  • Hoeren T, Niehoff M (2022) Artificial intelligence and data protection law. In: Corrales Compagnucci M, Wilson M, Fenwick M, Forgó N, Bärnighausen T (eds) AI in eHealth: human autonomy, data governance and privacy in healthcare. Cambridge University Press, Cambridge, pp 147–165

  • Irti C (2021) Consenso «negoziato» e circolazione dei dati personali. Giappichelli, Turin

  • Jabłonowska A (2022) Consumer protection in the age of data-driven behaviour modification. J Eur Consum Mark Law 11:67–71

  • Lanni S (2020) Dataquake: intelligenza artificiale e discriminazione del consumatore. Nuovo diritto civile 2:97–123

  • Leistner M (2021) The commission’s digital markets and services package—new rules for big tech and big data. J Eur Int IP 70:515–516

  • Li Z (2022) Affinity-based algorithmic pricing: a dilemma for EU data protection law. Comput Law Secur Rev. https://doi.org/10.2139/ssrn.4144571

  • Maffeis D (2015) Il diritto contrattuale antidiscriminatorio nelle indagini dottrinali recenti. Le nuove leggi civili e commentate 1:161–180

  • Mantelero A (2018) La gestione del rischio nel GDPR: limiti e sfide nel contesto dei big data e delle applicazioni di Artificial intelligence. In: Mantelero A, Poletti D (eds), Regolare la tecnologia. Il Reg. UE 2016/679 e la protezione dei dati personali. Un dialogo tra Italia e Spagna. University Press, Pisa, pp 289–305

  • Mattarella G (2020) Big data e accesso al credito degli immigrati: discriminazioni algoritmiche e tutela del consumatore. Giurisprudenza commerciale 4:696–716

  • Mazzini G, Scalzo S (2022) The proposal for the artificial intelligence act: considerations around some key concepts. In: Camardi C (ed) La via europea per l’intelligenza artificiale. Atti del Convegno del Progetto Dottorale di Alta Formazione in Scienze Giuridiche Ca’ Foscari Venezia, 25–26 novembre 2021. Wolters Kluwer-Cedam, Milan, pp 21–51

  • McMillan Cottom T (2020) Where platform capitalism and racial capitalism meet: the sociology of race and racism in the digital society. Sociology of Race and Ethnicity 6:441–449. https://doi.org/10.1177/2332649220949473

  • Mendola A (2023) Libertà di espressione e “costo” del consenso nell’era della condivisione digitale. Analisi comparatistica. Wolters Kluwer-Cedam, Milan

  • Mendoza I, Bygrave LA (2017) The right not to be subject to automated decisions based on profiling. In: Synodinou T, Jougleux P, Markou C, Prastitou T (eds) EU Internet law. Springer, Cham, pp 77–98

  • Messina D (2022) La proposta di regolamento europeo in materia di Intelligenza Artificiale: verso una “discutibile” tutela individuale di tipo consumer-centric nella società dominata dal “pensiero artificiale”. Rivista di diritto dei media 2:196–232

  • Mignone C (2014) Identità della persona e potere di disposizione. ESI, Naples

  • Montinaro R (2022) I sistemi di raccomandazione nelle interazioni tra professionisti e consumatori: il punto di vista del diritto dei consumi. Persona e Mercato 3:368–391

  • Morozzo Della Rocca P (2002) Gli atti discriminatori nel diritto civile, alla luce degli artt. 43 e 44 del T.U. sull’immigrazione. Il diritto di famiglia e delle persone 31:112–147

  • Morozzo Della Rocca P (2007) Le discriminazioni nei contratti di scambio di beni e servizi. In: Barbera M (ed) Il nuovo diritto antidiscriminatorio. Il quadro nazionale e comunitario. Giuffrè, Milan, pp 289–341

  • Navarretta E (2020) Principio di eguaglianza e diritto civile. Questione giustizia 1:23–27

  • Orlando S (2022a) Regole di immissione sul mercato e «pratiche di intelligenza artificiale» vietate nella proposta di Artificial Intelligence Act. Persona e Mercato 3:346–367

  • Orlando S (2022b) Verso il Data Act: la proposta di Regolamento del Parlamento e del Consiglio su regole armonizzate sull’accesso equo e l’uso dei dati (legge sui dati) COM(2022) 68 final del 23.2.2022. Persona e Mercato. Osservatorio OGID 1:166–168

  • Pagliantini S (2022) L'attuazione minimalista della Dir. 2019/770/UE: riflessioni sugli artt. 135 octies-135 vicies ter c. cons. La nuova disciplina dei contratti b-to-c per la fornitura di contenuti e servizi digitali. Nuove leggi civili commentate 6:1499–1560

  • Pajno A, Bassini M, De Gregorio G, Macchia M, Patti FP, Pollicino O, Quattrocolo S, Simeoli D, Sirena P (2019) AI: profili giuridici. Intelligenza Artificiale: criticità emergenti e sfide per il giurista. Rivista di BioDiritto 3:205–235

  • Pellecchia E (2018) Profilazione e decisioni automatizzate al tempo della black box society: qualità dei dati e leggibilità dell’algoritmo nella cornice della responsible research and innovation. Le nuove leggi civili commentate 5:1209–1236

  • Perlingieri P (2017) «Controllo» e «conformazione» degli atti di autonomia negoziale. In: Caterini E, Di Nella L, Flamini A, Mezzasoma L, Polidori S (eds) Scritti in onore di Vito Rizzo. II. ESI, Naples, pp 1635–1657

  • Perlingieri P (2018) Privacy digitale e protezione dei dati personali tra persona e mercato. Il Foro napoletano 2:481–484

  • Perlingieri C (2020a) Creazione e circolazione del bene prodotto dal trattamento algoritmico dei dati. In: Perlingieri P, Giova S, Prisco I (eds) Il trattamento algoritmico dei dati tra etica, diritto ed economia. ESI, Naples, pp 177–197

  • Perlingieri P (2020b) Il diritto civile nella legalità costituzionale. II. Fonti e interpretazione. ESI, Naples

  • Petit N, Van Cleynenbreugel P (2020) Questionnaire Topic 3: EU Competition Law and the Digital Economy. In: Mândrescu D (ed) EU Competition Law and the Digital Economy: Protecting Free and Fair Competition in an Age of Technological (R)evolution. The XXIX FIDE congress in The Hague. 2020 Congress Publications. Eleven International Publishing, Chicago, pp 1–8

  • Picker E (2003) L’antidiscriminazione come programma per il diritto privato. Rivista critica del diritto privato 21:687–703

  • Piraino F (2015) Il diritto europeo e la «giustizia contrattuale». Europa e diritto privato 2:233–293

  • Pizzetti F (2018) Intelligenza artificiale, protezione dei dati personali e regolazione. Giappichelli, Turin

  • Prince A, Schwarcz DB (2020) Proxy discrimination in the age of artificial intelligence and big data. Iowa Law Rev 105:1257–1318

  • Resta G (2019) Governare l’innovazione tecnologica: decisioni algoritmiche, diritti digitali e principio di uguaglianza. Politica del diritto 50:199–236

  • Resta G (2022) Pubblico, privato e collettivo nel sistema europeo di governo dei dati. Rivista trimestrale di diritto pubblico 4:971–995

  • Ricciuto V (2019) La patrimonializzazione dei dati personali. Contratto e mercato nella ricostruzione del fenomeno. In: Cuffaro V, D’Orazio R, Ricciuto V (eds) I dati personali nel diritto europeo. Giappichelli, Turin, pp 23–59

  • Ricciuto V (2022) L’equivoco della privacy. Persona vs dato personale. ESI, Naples

  • Roppo V (2011) Il contratto. Giuffrè, Milan

  • Roppo V (2020) Contrastare le disuguaglianze nelle relazioni contrattuali: fra diritto comune dei contratti e diritto dei contratti del mercato. Questione giustizia 1:28–32

  • Schmidt-Kessel M (2023) La responsabilità contrattuale del gestore di una piattaforma per la protezione dei dati personali del cliente nella prospettiva dei requisiti di qualità dell’adempimento. Pactum 1:153–162

  • Schönberger D (2019) Artificial intelligence in healthcare: a critical analysis of the legal and ethical implications. Int J Law Inf Technol 27:171–203

  • Schulze R (2008) Common frame of reference and existing EC contract law. Sellier de Gruyter, Munich

  • Sganga C (2022) Ventisei anni di Direttiva Database alla prova della nuova Strategia Europea per i Dati: evoluzioni giurisprudenziali e percorsi di riforma. Diritto dell’informazione e dell’informatica 3:657–710

  • Sitzia L (2011) Pari dignità e discriminazione. ESI, Naples

  • Skitka LJ, Mosier KL, Burdick M (1999) Does automation bias decision-making? Int J Hum Comput Stud 51:991–1006

  • Smuha NA, Ahmed-Rengers E, Harkens A, Li W, MacLaren J, Piselli R, Yeung K (2021) How the EU can achieve legally trustworthy AI: a response to the European Commission’s proposal for an Artificial Intelligence Act. https://doi.org/10.2139/ssrn.3899991

  • Thobani S (2015) Il consenso al trattamento dei dati come condizione per la fruizione di servizi online. In: Perlingieri C, Ruggieri L (eds), Internet e Diritto civile. ESI, Naples, pp 459–484

  • Timmis S, Broadfoot P, Sutherland R, Oldfield A (2015) Rethinking assessment in a digital age: opportunities, challenges and risks. Br Educ Res J 42:454–476

  • Tomassini L (2020) Il grande salto: L’uomo, il digitale e la più importante evoluzione della nostra storia. Luiss University Press, Rome

  • Tommasi S (2011) La non discriminazione nel DCFR. Rivista critica del diritto privato 1:119–128

  • Tullio J (2023) Riflessioni sulla disposizione dei dati personali nei negozi di adesione a piattaforme online. Tecnologia e diritto 1:78–105

  • Veale M, Edwards L (2018) Clarity, surprises, and further questions in the article 29 working party draft guidance on automated decision-making and profiling. Comput Law Secur Rev 34:398–404

  • Wachter S, Mittelstadt B, Russell C (2021) Why fairness cannot be automated: bridging the gap between EU non-discrimination law and AI. Comput Law Secur Rev 41:1–72

  • Wójcik MA (2022) Algorithmic discrimination: an EU law perspective. Health Hum Rights 24:93–103

  • Zadeh NK, Robertson K, Green JA (2017) ‘At-risk’ individuals’ responses to direct-to-consumer advertising of prescription drugs: a nationally representative cross-sectional study. BMJ 7:1–10

  • Zuiderveen Borgesius FJ (2019) Discrimination, artificial intelligence, and algorithmic decision-making. https://rm.coe.int/discrimination-artificial-intelligence-and-algorithmic-decision-making/1680925d73. Accessed 30 May 2023

  • Zuiderveen Borgesius FJ (2020) Strengthening legal protection against discrimination by algorithms. Int J Hum Rights 24:1–22

  • Zuiderveen Borgesius FJ (2023) Digitale discriminatie en differentiatie: het recht is er nog niet klaar voor. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4520646. Accessed 31 July 2023

Author information

Correspondence to Sara Tommasi.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Tommasi, S. (2023). Discrimination in the Digital Market: Protection from Different Sides. In: The Risk of Discrimination in the Digital Market. SpringerBriefs in Law. Springer, Cham. https://doi.org/10.1007/978-3-031-43640-6_4

  • DOI: https://doi.org/10.1007/978-3-031-43640-6_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-43639-0

  • Online ISBN: 978-3-031-43640-6

  • eBook Packages: Law and Criminology, Law and Criminology (R0)
