Abstract
AI promises fast, consistent, and rational assessments. Yet algorithmic decision-making, too, has proven to be potentially discriminatory. EU antidiscrimination law is equipped with an appropriate doctrinal toolkit to confront this new phenomenon. This is particularly true in view of the legal recognition of indirect discrimination, which no longer requires strict proof of causality but focuses on conspicuous correlations instead. As a result, antidiscrimination law depends heavily on knowledge about vulnerable groups, on both a conceptual and a factual level. This Chapter therefore recommends a partial realignment of the law towards a paradigm of knowledge creation when it is faced with potentially discriminatory AI.
The basic argument of this contribution stems from Tischbirek (2019).
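To illustrate what a ‘conspicuous correlation’ can look like in operational terms, the following minimal Python sketch compares selection rates across two groups and computes a disparate-impact ratio, loosely following the US ‘four-fifths’ heuristic discussed in the disparate-impact literature (cf. Barocas and Selbst 2016). The data, group labels, and 0.8 threshold are hypothetical illustrations, not statements of EU doctrine.

```python
# Hypothetical sketch: surfacing a "conspicuous correlation" as a
# disparate-impact ratio between selection rates of two groups.
# All data and names are invented for illustration only.

from collections import Counter

def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs; returns rate per group."""
    totals, selected = Counter(), Counter()
    for group, was_selected in decisions:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def disparate_impact_ratio(decisions, protected, reference):
    """Ratio of the protected group's selection rate to the reference group's."""
    rates = selection_rates(decisions)
    return rates[protected] / rates[reference]

# Hypothetical hiring data: 6 of 20 applicants from group A selected (30%),
# 12 of 20 from group B (60%) -> ratio 0.5, below the 0.8 heuristic threshold.
data = [("A", True)] * 6 + [("A", False)] * 14 + \
       [("B", True)] * 12 + [("B", False)] * 8

ratio = disparate_impact_ratio(data, protected="A", reference="B")
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.50 < 0.80 flags a disparity
```

No claim of causality is needed for such a flag: the ratio merely records a correlation between group membership and outcome, which is precisely the evidentiary shift the Chapter examines.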
Notes
- 1.
On AI’s potential to rationalize administrative decision-making processes, see Hermstrüwer, paras 3 et seq.
- 2.
In contrast, see Danziger et al. (2011) for an empirical study on the effects of a lunch break on (human) decision-making in court.
- 3.
- 4.
Barocas and Selbst (2016), pp. 692–693.
- 5.
For a good introduction, see O’Neil (2016), pp. 15–31.
- 6.
See Hermstrüwer, paras 21–38.
- 7.
Calders and Žliobaitė (2013), p. 51.
- 8.
- 9.
Regan (2016).
- 10.
Calders and Žliobaitė (2013), p. 50.
- 11.
- 12.
Wolfangel (2017).
- 13.
Möllers (2015), pp. 13–17.
- 14.
CJEU C-236/09 ‘Association belge des Consommateurs Test-Achats ASBL et al. v. Conseil des ministres’ (1 March 2011), 2011 E.C.R. I-773, paras 30–32. To be precise: the Court invalidated Article 5(2) of Council Directive 2004/113/EC, (2004) O.J. L 373 37–43, which allowed for gender-specific tariffs under certain procedural conditions. It held that such (permanent) exemptions to the non-discrimination clause of Article 5(1) of the Directive constituted a violation of Articles 21 and 23 of the EU Charter of Fundamental Rights.
- 15.
For Germany, see Statistisches Bundesamt (2017), pp. 12, 21.
- 16.
The World Bank (2017).
- 17.
For an attempt to justify gender-specific insurance tariffs in the U.K. on statistical and sociological grounds, see The Social Issues Research Center Oxford (2004).
- 18.
In Test-Achats, the CJEU only referred to the problem of differences in life expectancy, since Article 5(3) of Council Directive 2004/113/EC explicitly prohibits actuarially imposing the costs of pregnancy on women alone.
- 19.
- 20.
For a discussion of ‘vitality’ programs in health insurance see Ernst, paras 2–3.
- 21.
See Wischmeyer, paras 9 et seq.; the users of AI may even want to prevent reverse-engineering in order to forestall gaming, see Hermstrüwer, paras 65–69.
- 22.
Cf. Wischmeyer (2018), pp. 42–46.
- 23.
- 24.
Fredman (2011), p. 203.
- 25.
- 26.
Mangold (2016), p. 223.
- 27.
U.S. Supreme Court ‘Brown v. Board of Education’ (17 May 1954), 347 U.S. 483.
- 28.
CJEU 80/70 ‘Defrenne v. Sabena I’ (25 May 1971), 1971 E.C.R. 445; 43/75 ‘Defrenne v. Sabena II’ (8 April 1976), 1976 E.C.R. 455; 149/77 ‘Defrenne v. Sabena III’ (15 June 1978) 1978 E.C.R. 1365.
- 29.
Council Directive 2000/43/EC, (2000) O.J. L 180 22–26. Corresponding provisions can be found in Article 10(1) of Directive 2000/78/EC, (2000) O.J. L 303 16–22, Article 9(1) of Directive 2004/113/EC and Article 19(1) of Directive 2006/54/EC, (2006) O.J. L 204 23–36.
- 30.
Ellis and Watson (2012), pp. 157–163.
- 31.
Note that police action generally falls outside the scope of Council Directive 2000/43/EC. The same is not necessarily true for Article 21 of the Charter of Fundamental Rights, however, and a shift in the burden of proof can also result from constitutional law or even conventional administrative law doctrine without any reference to the EU anti-discrimination directives.
- 32.
- 33.
Fredman (2011), pp. 177–189, 203–204.
- 34.
- 35.
Cf. the classic conception of ‘tacit knowledge’ by Polanyi (1966), p. 4: ‘we can know more than we can tell’.
- 36.
Tourkochoriti (2017).
- 37.
U.S. Supreme Court ‘Griggs et al. v. Duke Power Co.’ (8 March 1971), 401 U.S. 424.
- 38.
See the direct quotation in the Opinion of Advocate General Warner to CJEU 96/80 ‘Jenkins v. Kingsgate’ (delivered 28 January 1981), 1981 E.C.R. 911, 936.
- 39.
CJEU 96/80 ‘Jenkins v. Kingsgate’ (31 March 1981), 1981 E.C.R. 911, para 13.
- 40.
CJEU 96/80 ‘Jenkins v. Kingsgate’ (31 March 1981), 1981 E.C.R. 911, 913.
- 41.
CJEU 170/84 ‘Bilka v. Weber von Hartz’ (13 May 1986), 1986 E.C.R. 1607, para 29.
- 42.
See, for example, Fassbender (2006), pp. 248–249.
- 43.
See Wischmeyer, paras 13–14.
- 44.
- 45.
Hacker (2018), pp. 1160–1165, by contrast, believes proportionality to offer a ‘relatively straightforward route to justification’ on economic grounds, at least in cases of proxy discrimination.
- 46.
Most recently, see CJEU C-68/17 ‘IR v. JQ’ (11 September 2018), ECLI:EU:C:2018:696, para 67.
- 47.
Cf. Baer (2009).
- 48.
For the category of race, see most recently Feldmann et al. (2018).
- 49.
Preamble lit. e and Article 1(2) of the UN Convention on the Rights of Persons with Disabilities.
- 50.
Cf. Ainsworth (2015).
- 51.
For a critical account, see Keaton (2010).
- 52.
Cf. Liebscher et al. (2012).
- 53.
Barskanmaz (2011).
- 54.
Conseil Constitutionnel Decision CC 2007-557 DC concerning the constitutionality of the Loi relative à la maîtrise de l’immigration, à l’intégration et à l’asile (15 November 2007), ECLI:FR:CC:2007:2007.557.DC.
- 55.
Regulation 2016/679, (2016) O.J. L 119 1–88.
- 56.
Cf. Glass (2010), pp. 65–68; on the concepts underlying European privacy law, see also Marsch, paras 5 et seq.
- 57.
- 58.
Ladeur (2012), pp. 77–80, 82–86.
- 59.
Berghahn et al. (2016), pp. 101, 161–162.
- 60.
Cf. Berghahn et al. (2016), pp. 101–102.
- 61.
Barocas and Selbst (2016), p. 719; see Wischmeyer, paras 10–15.
- 62.
Hacker (2018), pp. 1177–1179.
References
Ainsworth C (2015) Sex redefined. Nature 518:288–291. https://doi.org/10.1038/518288a
Baer S (2009) Chancen und Grenzen positiver Maßnahmen nach § 5 AGG. www.rewi.hu-berlin.de/de/lf/ls/bae/w/files/ls_aktuelles/09_adnb_baer.pdf. Accessed 4 Feb 2019
Barocas S, Selbst AD (2016) Big data’s disparate impact. Calif Law Rev 104:671–732
Barskanmaz C (2011) Rasse - Unwort des Antidiskriminierungsrechts. Kritische Justiz 44:382–389
Berghahn S, Klapp M, Tischbirek A (2016) Evaluation des Allgemeinen Gleichbehandlungsgesetzes. Reihe der Antidiskriminierungsstelle des Bundes. Nomos, Baden-Baden
Calders T, Žliobaitė I (2013) Why unbiased computational processes can lead to discriminative decision procedures. In: Custers B, Calders T, Schermer B, Zarsky T (eds) Discrimination and privacy in the information society. Studies in applied philosophy, epistemology and rational ethics. Springer, Berlin, pp 43–57
Caliskan A, Bryson JJ, Narayanan A (2017) Semantics derived automatically from language corpora contain human-like biases. Science 356:183–186. https://doi.org/10.1126/science.aal4230
Chayes A (1976) The role of the judge in public law litigation. Harv Law Rev 89:1281–1316. https://doi.org/10.2307/1340256
Crenshaw K (1989) Demarginalizing the intersection of race and sex: a black feminist critique of antidiscrimination doctrine, feminist theory and antiracist politics. Univ Chicago Legal Forum 1989:139–167
Danziger S, Levav J, Avnaim-Pesso L (2011) Extraneous factors in judicial decisions. Proc Natl Acad Sci USA 108:6889–6892
Davis KC (1942) An approach to problems of evidence in the administrative process. Harv Law Rev 55:364–425. https://doi.org/10.2307/1335092
Delgado R, Stefancic J (2017) Critical race theory, 3rd edn. New York University Press, New York
Ellis E, Watson P (2012) EU anti-discrimination law. Oxford EU law library, 2nd edn. Oxford University Press, Oxford
Fassbender B (2006) Wissen als Grundlage staatlichen Handelns. In: Isensee J, Kirchhof P (eds) Handbuch des Staatsrechts, vol IV, 3rd edn. C.F. Müller, Heidelberg, pp 243–312
Feldmann D, Hoffmann J, Keilhauer A, Liebold R (2018) “Rasse” und “ethnische Herkunft” als Merkmale des AGG. Rechtswissenschaft 9:23–46
Ferguson A (2015) Big data and predictive reasonable suspicion. Univ Pa Law Rev 163:327–410
Fredman S (2011) Discrimination law. Clarendon law series, 2nd edn. Oxford University Press, Oxford
Friedman B, Nissenbaum H (1996) Bias in computer systems. ACM Trans Inf Syst 14:330–347. https://doi.org/10.1145/230538.230561
Gellert R, de Vries K, de Hert P, Gutwirth S (2013) A comparative analysis of anti-discrimination and data protection legislations. In: Custers B, Calders T, Schermer B, Zarsky T (eds) Discrimination and privacy in the information society. Studies in applied philosophy, epistemology and rational ethics. Springer, Berlin, pp 61–88
Glass A (2010) Privacy and law. In: Blatterer H, Johnson P (eds) Modern privacy. Palgrave Macmillan, London, pp 59–72
Hacker P (2018) Teaching fairness to artificial intelligence: existing and novel strategies against algorithmic discrimination under EU law. Common Mark Law Rev 55:1143–1186
Hand DJ (2006) Classifier technology and the illusion of progress. Stat Sci 21:1–14. https://doi.org/10.1214/088342306000000060
Jolls C, Sunstein CR (2006) The law of implicit bias. Calif Law Rev 94:969–996
Keaton TD (2010) The politics of race-blindness: (anti)blackness and category-blindness in contemporary France. Du Bois Rev Soc Sci Res Race 7:103–131. https://doi.org/10.1017/S1742058X10000202
Krieger LH (1995) The content of our categories: a cognitive bias approach to discrimination and equal employment opportunity. Stanford Law Rev 47:1161–1248
Kroll JA, Huey J, Barocas S, Felten EW, Reidenberg JR, Robinson DG, Yu H (2017) Accountable algorithms. Univ Pa Law Rev 165:633–795
Ladeur K-H (2012) Die Kommunikationsinfrastruktur der Verwaltung. In: Hoffmann-Riem W, Schmidt-Assmann E, Voßkuhle A (eds) Grundlagen des Verwaltungsrechts, vol 2. C.H. Beck, München, pp 35–106
Liebscher D, Naguib T, Plümecke T, Remus J (2012) Wege aus der Essentialismusfalle: Überlegungen zu einem postkategorialen Antidiskriminierungsrecht. Kritische Justiz 45:204–218
Mangold AK (2016) Demokratische Inklusion durch Recht. Habilitation treatise, Johann Wolfgang Goethe-Universität, Frankfurt a.M.
Möllers C (2015) Die Möglichkeit der Normen. Suhrkamp, Berlin
O’Neil C (2016) Weapons of math destruction. How big data increases inequality and threatens democracy. Crown, New York
Polanyi M (1966) The tacit dimension. The University of Chicago Press, Chicago
Rademacher T (2017) Predictive Policing im deutschen Polizeirecht. Archiv des öffentlichen Rechts 142:366–416
Regan J (2016) New Zealand passport robot tells applicant of Asian descent to open eyes. Available via Reuters. www.reuters.com/article/us-newzealand-passport-error/new-zealand-passport-robot-tells-applicant-of-asian-descent-to-open-eyes-idUSKBN13W0RL. Accessed 10 Oct 2018
Statistisches Bundesamt (2017) Verkehrsunfälle: Unfälle von Frauen und Männern im Straßenverkehr 2016. Available via DESTATIS. www.destatis.de/DE/Publikationen/Thematisch/TransportVerkehr/Verkehrsunfaelle/UnfaelleFrauenMaenner5462407167004.pdf?__blob=publicationFile. Accessed 10 Oct 2018
The Social Issues Research Center Oxford (2004) Sex differences in driving and insurance risk. www.sirc.org/publik/driving.pdf. Accessed 10 Oct 2018
The World Bank (2017) Data: Life expectancy at birth. data.worldbank.org/indicator/SP.DYN.LE00.MA.IN (male) and data.worldbank.org/indicator/SP.DYN.LE00.FE.IN (female). Accessed 10 Oct 2018
Thüsing G (2015) Allgemeines Gleichbehandlungsgesetz. In: Säcker FJ (ed) Münchener Kommentar zum BGB, vol 1, 7th edn. Beck, München, pp 2423–2728
Tischbirek A (2019) Wissen als Diskriminierungsfrage. In: Münkler (ed) Dimensionen des Wissens im Recht. Mohr Siebeck, Tübingen, pp 67–88
Tourkochoriti I (2017) Jenkins v. Kingsgate and the migration of the US disparate impact doctrine in EU law. In: Nicola F, Davies B (eds) EU law stories. Cambridge University Press, Cambridge, pp 418–445
Wischmeyer T (2018) Regulierung intelligenter Systeme. Archiv des öffentlichen Rechts 143:1–66. https://doi.org/10.1628/aoer-2018-0002
Wolfangel E (2017) Künstliche Intelligenz voller Vorurteile. Available via NZZ. www.nzz.ch/wissenschaft/selbstlernende-algorithmen-kuenstliche-intelligenz-voller-vorurteile-ld.1313680. Accessed 10 Oct 2018
Žliobaitė I, Custers B (2016) Using sensitive personal data may be necessary for avoiding discrimination in data-driven decision models. Artif Intell Law 24:183–201. https://doi.org/10.1007/s10506-016-9182-5
Copyright information
© 2020 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Tischbirek, A. (2020). Artificial Intelligence and Discrimination: Discriminating Against Discriminatory Systems. In: Wischmeyer, T., Rademacher, T. (eds) Regulating Artificial Intelligence. Springer, Cham. https://doi.org/10.1007/978-3-030-32361-5_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-32360-8
Online ISBN: 978-3-030-32361-5