
Artificial Intelligence and Discrimination: Discriminating Against Discriminatory Systems

In: Regulating Artificial Intelligence

Abstract

AI promises fast, consistent, and rational assessments. Nevertheless, algorithmic decision-making, too, has proven to be potentially discriminatory. EU antidiscrimination law is equipped with an appropriate doctrinal toolkit to face this new phenomenon. This is particularly true in view of the legal recognition of indirect discrimination, which no longer requires strict proof of causality but focuses on conspicuous correlations instead. As a result, antidiscrimination law depends heavily on knowledge about vulnerable groups, on both a conceptual and a factual level. This chapter hence recommends a partial realignment of the law towards a paradigm of knowledge creation when faced with potentially discriminatory AI.

The basic argument of this contribution stems from Tischbirek (2019).


Notes

  1.

    On AI’s potential to rationalize administrative decision-making processes see Hermstrüwer, paras 3 et seq.

  2.

    In contrast, see Danziger et al. (2011) for an empirical study on the effects of a lunch break on (human) decision-making in court.

  3.

    See, inter alia, Calders and Žliobaitė (2013), Barocas and Selbst (2016), Žliobaitė and Custers (2016), O’Neil (2016), Caliskan et al. (2017), Kroll et al. (2017), Hacker (2018); for an early account, see Friedman and Nissenbaum (1996).

  4.

    Barocas and Selbst (2016), pp. 692–693.

  5.

    For a good introduction, see O’Neil (2016), pp. 15–31.

  6.

    See Hermstrüwer, paras 21–38.

  7.

    Calders and Žliobaitė (2013), p. 51.

  8.

    Ferguson (2015), pp. 401–403; for German police law see Rademacher, paras 35 et seq. and Rademacher (2017), p. 376.

  9.

    Regan (2016).

  10.

    Calders and Žliobaitė (2013), p. 50.

  11.

    Barocas and Selbst (2016), p. 681, with reference to Hand (2006), p. 10; see Ernst, paras 4–5.

  12.

    Wolfangel (2017).

  13.

    Möllers (2015), pp. 13–17.

  14.

    CJEU C-236/09 ‘Association belge des Consommateurs Test-Achats ASBL et al. v. Conseil des ministres’ (1 March 2011), 2011 E.C.R. 773, paras 30–32. To be precise: the Court invalidated Article 5(2) of Council Directive 2004/113/EC, (2004) O.J. L 373 37–43, which allowed for gender-specific tariffs under certain procedural conditions. It held that such (permanent) exemptions to the non-discrimination clause of Article 5(1) of the Directive constituted a violation of Articles 21 and 23 of the EU Charter of Fundamental Rights.

  15.

    For Germany, see Statistisches Bundesamt (2017), pp. 12, 21.

  16.

    The World Bank (2017).

  17.

    For an attempt to statistically and sociologically justify gender-specific insurance tariffs in the U.K., see The Social Issues Research Centre Oxford (2004).

  18.

    In Test-Achats, the CJEU only referred to the problem of differences in life expectancy, since Article 5(3) of Council Directive 2004/113/EC explicitly prohibits actuarially imposing the costs of pregnancy on women alone.

  19.

    For discussions of Test-Achats in light of AI decision-making, cf. Gellert et al. (2013), pp. 79–81; Hacker (2018), pp. 1166–1167.

  20.

    For a discussion of ‘vitality’ programs in health insurance see Ernst, paras 2–3.

  21.

    See Wischmeyer, paras 9 et seq.; reverse-engineering may even be undesired by the users of AI in order to prevent gaming, see Hermstrüwer, paras 65–69.

  22.

    Cf. Wischmeyer (2018), pp. 42–46.

  23.

    For a more skeptical assessment of current antidiscrimination law doctrine, see Barocas and Selbst (2016), and Hacker (2018).

  24.

    Fredman (2011), p. 203.

  25.

    Ellis and Watson (2012), pp. 163–165; Thüsing (2015), § 3 AGG at para 9.

  26.

    Mangold (2016), p. 223.

  27.

    U.S. Supreme Court ‘Brown v. Board of Education’ (17 May 1954), 347 U.S. 483.

  28.

    CJEU 80/70 ‘Defrenne v. Belgian State’ (‘Defrenne I’) (25 May 1971), 1971 E.C.R. 445; 43/75 ‘Defrenne v. Sabena II’ (8 April 1976), 1976 E.C.R. 455; 149/77 ‘Defrenne v. Sabena III’ (15 June 1978), 1978 E.C.R. 1365.

  29.

    Council Directive 2000/43/EC, (2000) O.J. L 180 22–26. Corresponding provisions can be found in Article 10(1) of Directive 2000/78/EC, (2000) O.J. L 303 16–22, Article 9(1) of Directive 2004/113/EC and Article 19(1) of Directive 2006/54/EC, (2006) O.J. L 204 23–36.

  30.

    Ellis and Watson (2012), pp. 157–163.

  31.

    It must be mentioned that police action is generally beyond the scope of Council Directive 2000/43/EC. The same is not necessarily true for Article 21 of the Charter of Fundamental Rights, however, and a shift in the burden of proof can also result from constitutional law or even conventional administrative law doctrine without any reference to the EU antidiscrimination directives.

  32.

    This is referred to as ‘structural’ or ‘institutional’ discrimination. See Delgado and Stefancic (2017), pp. 31–35. For structural discrimination concerning two or more ‘intersecting’ categories, see Crenshaw (1989).

  33.

    Fredman (2011), pp. 177–189, 203–204.

  34.

    Krieger (1995), and Jolls and Sunstein (2006).

  35.

    Cf. the classic conception of ‘tacit knowledge’ by Polanyi (1966), p. 4: ‘we can know more than we can tell’.

  36.

    Tourkochoriti (2017).

  37.

    U.S. Supreme Court ‘Griggs et al. v. Duke Power Co.’ (8 March 1971), 401 U.S. 424.

  38.

    See the direct quotation in the Opinion of Advocate General Warner to CJEU 96/80 ‘Jenkins v. Kingsgate’ (delivered 28 January 1981), 1981 E.C.R. 911, 936.

  39.

    CJEU 96/80 ‘Jenkins v. Kingsgate’ (31 March 1981), 1981 E.C.R. 911, para 13.

  40.

    CJEU 96/80 ‘Jenkins v. Kingsgate’ (31 March 1981), 1981 E.C.R. 911, 913.

  41.

    CJEU 170/84 ‘Bilka v. Weber von Hartz’ (13 May 1986), 1986 E.C.R. 1607, para 29.

  42.

    See, for example, Fassbender (2006), pp. 248–249.

  43.

    See Wischmeyer, paras 13–14.

  44.

    See for U.S. law: Barocas and Selbst (2016), pp. 701–702; for EU law: Hacker (2018), pp. 1152–1154.

  45.

    Hacker (2018), pp. 1160–1165, by contrast, believes proportionality to offer a ‘relatively straightforward route to justification’ on economic grounds, at least in cases of proxy discrimination.

  46.

    Most recently, see CJEU C-68/17 ‘IR v. JQ’ (11 September 2018), ECLI:EU:C:2018:696, para 67.

  47.

    Cf. Baer (2009).

  48.

    For the category of race, see most recently Feldmann et al. (2018).

  49.

    Preamble lit. e and Article 1(2) of the UN Convention on the Rights of Persons with Disabilities.

  50.

    Cf. Ainsworth (2015).

  51.

    For a critical account, see Keaton (2010).

  52.

    Cf. Liebscher et al. (2012).

  53.

    Barskanmaz (2011).

  54.

    Conseil Constitutionnel Decision CC 2007-557 DC concerning the constitutionality of the Loi relative à la maîtrise de l’immigration, à l’intégration et à l’asile (15 November 2007), ECLI:FR:CC:2007:2007.557.DC.

  55.

    Regulation 2016/679, (2016) O.J. L 119 1–88.

  56.

    Cf. Glass (2010), pp. 65–68; on the concepts underlying European privacy law see also Marsch, paras 5 et seq.

  57.

    Davis (1942), pp. 402–403; Chayes (1976), pp. 1282–1283.

  58.

    Ladeur (2012), pp. 77–80, 82–86.

  59.

    Berghahn et al. (2016), pp. 101, 161–162.

  60.

    Cf. Berghahn et al. (2016), pp. 101–102.

  61.

    Barocas and Selbst (2016), p. 719; see Wischmeyer, paras 10–15.

  62.

    Hacker (2018), pp. 1177–1179.


Author information

Correspondence to Alexander Tischbirek.

Copyright information

© 2020 Springer Nature Switzerland AG

Cite this chapter

Tischbirek, A. (2020). Artificial Intelligence and Discrimination: Discriminating Against Discriminatory Systems. In: Wischmeyer, T., Rademacher, T. (eds) Regulating Artificial Intelligence. Springer, Cham. https://doi.org/10.1007/978-3-030-32361-5_5

  • DOI: https://doi.org/10.1007/978-3-030-32361-5_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-32360-8

  • Online ISBN: 978-3-030-32361-5