
Gender Equality and Artificial Intelligence: SDG 5 and the Role of the UN in Fighting Stereotypes, Biases, and Gender Discrimination

Chapter in: Women’s Empowerment and Its Limits

Abstract

Gender inequalities and discrimination are a global problem affecting all member states of the United Nations (UN). Since the rise of digital technologies such as artificial intelligence (AI) and algorithms, new challenges have emerged for achieving gender equality. While some argue that digital technologies could be an opportunity to overcome some gender inequalities, many warn of the new dangers associated with the use of AI and algorithms, particularly for disadvantaged groups of society that have suffered inequalities in their daily lives for centuries. To assess the UN’s role in preserving gender equality in the age of algorithms, this chapter is framed by the Sustainable Development Goals (SDGs), especially SDG 5, and includes a brief overview of the broader policy and institutional framework of the UN. It is argued here that an adequate legal and policy framework with a clear gender dimension is needed at the global level and that it could be achieved by involving the main gender equality actors of the UN system. Drawing on the available literature and policy proposals, the last section sketches out some elements of a forward-looking, sustainable, and antifragile UN framework that could be used to address the challenges of AI for gender equality.


Notes

  1. For gender bias and policy options from a behavioral science perspective, see Bohnet (2016). The author would like to thank the UN Library in Geneva and its helpful team for making it possible to conduct part of the research and writing of this chapter there.

  2. Fry (2018), pp. 77–80.

  3. On algorithms and inequality, see, for example, Eubanks (2018).

  4. United Nations High Commissioner for Human Rights. 2021. The Right to Privacy in the Digital Age. A/HRC/48/31; United Nations Human Rights Council. 2022. Rights of Persons with Disabilities, Report of the Special Rapporteur on the Rights of Persons with Disabilities. A/HRC/49/52; European Commission, Directorate-General for Justice and Consumers, Gerards, J., Xenidis, R. 2021. Algorithmic Discrimination in Europe: Challenges and Opportunities for Gender Equality and Non-Discrimination Law. Publications Office of the European Union. https://data.europa.eu/doi/10.2838/544956; UNESCO. 2020. Artificial Intelligence and Gender Equality: Key Findings of UNESCO’s Global Dialogue. GEN/2020/AI/2 REV.

  5. See World Economic Forum. 2022. Gender, Artificial Intelligence and Driverless Cars. https://www.weforum.org/agenda/2022/08/gender-artificial-intelligence-driverless-cars-technology-opinion. Accessed 31 October 2022.

  6. For a good overview, see Schulz (2022).

  7. See World Economic Forum. 2019. This Is Why AI Has a Gender Problem. https://www.weforum.org/agenda/2019/06/this-is-why-ai-has-a-gender-problem/. Accessed 31 October 2022.

  8. See World Economic Forum. 2022. Gender, Artificial Intelligence and Driverless Cars. https://www.weforum.org/agenda/2022/08/gender-artificial-intelligence-driverless-cars-technology-opinion. Accessed 31 October 2022; World Economic Forum. 2022. Global Gender Gap Report 2022. Insight Report. https://www3.weforum.org/docs/WEF_GGGR_2022.pdf. Accessed 31 October 2022; and World Economic Forum. 2021. 136 Years Is the Estimated Journey Time to Gender Equality. https://www.weforum.org/agenda/2021/04/136-years-is-the-estimated-journey-time-to-gender-equality/. Accessed 31 October 2022. In 2018 it was estimated that it would take 108 years, and in 2019, 99.5 years.

  9. Lambrecht and Tucker (2019).

  10. Vigdor, Neil. 2019. Apple Credit Card Investigation. The New York Times, November 10.

  11. Latonero, Mark. 2018. Governing Artificial Intelligence: Upholding Human Rights & Dignity. Report. Data & Society, 9. https://datasociety.net/library/governing-artificial-intelligence. Accessed 31 October 2022.

  12. See Dastin, Jeffrey. 2018. Amazon Scraps Secret AI Recruiting Tool That Showed Bias Against Women. Reuters. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G. Accessed 31 October 2022.

  13. Smith and Rustagi (2021).

  14. Datta et al. (2014).

  15. Kay et al. (2015).

  16. Prates et al. (2020).

  17. Kurita et al. (2019).

  18. For the role of big technological companies in managing data, see also Zuboff (2019).

  19. See UN Women. 2013. UN Women Ad Series Reveals Widespread Sexism. https://www.unwomen.org/en/news/stories/2013/10/women-should-ads. Accessed 31 October 2022; Hosanagar (2020), pp. 42–43.

  20. Beard (2017), p. xi.

  21. See MacAskill (2022b), Greaves and MacAskill (2019), and MacAskill (2022a), pp. 9–29.

  22. For gender-responsive approaches, see Budlender et al. (2002), Kalbarczyk et al. (2022), and Nelson, Sibyl, and Sophia Huyer. 2016. A Gender-Responsive Approach to Climate-Smart Agriculture: Evidence and Guidance for Practitioners. Climate-Smart Agriculture Practice Brief. https://ccafs.cgiar.org/resources/publications/gender-responsive-approach-climate-smart-agriculture-evidence-and. Accessed 31 October 2022.

  23. Gender-transformative approaches go further than merely gender-blind, gender-aware, or gender-responsive policies by addressing the causes of gender-based inequalities and working toward transforming harmful gender roles, norms, and power relations. See UNICEF. 2019. Technical Note on Gender-Transformative Approaches in the Global Programme to End Child Marriage Phase II: A Summary for Practitioners. https://www.unicef.org/media/58196/file. Accessed 31 October 2022. The so-called gender equity continuum was originally set out in Pederson et al. (2015). Examples at UN level include UN Women (such as the UN Inter-Agency Network on Women and Gender Equality. 2021. Repository on Resources and Tools for Capacity Development on Gender Mainstreaming within the United Nations System. https://www.unwomen.org/sites/default/files/Headquarters/Attachments/Sections/How%20We%20Work/UNSystemCoordination/IANWGE/IANWGE-Resources-and-tools-for-capacity-development-in-gender-mainstreaming-en.pdf. Accessed 31 October 2022); the FAO (such as FAO. n.d. Joint Programme on Gender Transformative Approaches for Food Security and Nutrition. https://www.fao.org/joint-programme-gender-transformative-approaches/overview/gender-transformative-approaches/en. Accessed 31 October 2022); and UNFPA (such as UNFPA, UNICEF, UN Women. 2020. Technical Note on Gender-Transformative Approaches: A Summary for Practitioners. https://www.unfpa.org/resources/technical-note-gender-transformative-approaches-summary-practitioners. Accessed 31 October 2022).

  24. Pasquale (2020).

  25. Schulz (2022), p. 40.

  26. This chapter will not discuss technical details of AI and algorithms and will use the terms AI and algorithms to describe the multiple techniques typically grouped under them. For a basic understanding of how algorithms work, notably in the area of gender equality, see Table 9.2 and the non-technical explanations in Lütz (2022a). For a more technical description, see Früh and Haux (2022).

  27. For more on SDG 5 in general, see Eden and Wagstaff (2021); Solomon et al. (2021); Ijjas (2021).

  28. Buvinic and Levine (2016); Perez (2019).

  29. European Commission, Digital Economy and Society Index (DESI) 2021, p. 16, available at: https://digital-strategy.ec.europa.eu/en/library/digital-economy-and-society-index-desi-2021. Accessed 29 September 2022.

  30. In the EU, only around 22% of programmers are women. See European Commission and Directorate-General for Communication. 2020. Striving for a Union of Equality: The Gender Equality Strategy 2020–2025. https://doi.org/10.2775/671326.

  31. For more on noise in human decision-making and how algorithms could help overcome this decision error, see Kahneman, Sibony, and Sunstein (2021).

  32. Houser (2019).

  33. For a legal view on the human rights implications and how to regulate gender-based algorithmic discrimination, see the chapter by Lütz (2023a).

  34. See Lütz (2022a, 2022b).

  35. European Commission. 2022. Communication on the European Care Strategy. COM(2022) 440 final.

  36. Ibid., p. 2.

  37. Ibid., p. 23.

  38. Delage (2018), p. 47.

  39. Beghini et al. (2022), p. 53.

  40. European Commission. 2022. Communication on the European Care Strategy. COM(2022) 440 final, p. 12. See also the European Commission campaign #EndGenderStereotypes, launched on 8 March 2023, https://end-gender-stereotypes.campaign.europa.eu/index_en. Accessed 31 October 2022.

  41. Oliveira et al. (2020).

  42. For algorithmic discrimination and gender equality, see Lütz (2022a, 2022b).

  43. For ways to assess potential biases and the discriminatory potential of algorithms, see Kazim et al. (2021).

  44. European Institute for Gender Equality. 2022. Artificial Intelligence, Platform Work and Gender Equality. https://doi.org/10.2839/805305.

  45. See International Labour Office, Bureau for Gender Equality. 2004. Promoting Gender Equality: Guide on ILO Conventions and Recommendations of Particular Concern to Women Workers. Geneva: International Labour Organization.

  46. See Lütz (2023b).

  47. See the comparative Table 9.1.

  48. Kahneman et al. (2021).

  49. See Parliamentary Assembly of the Council of Europe, Committee on Equality and Non-Discrimination. 2020. Report on Preventing Discrimination Caused by the Use of Artificial Intelligence. Doc. 15151, para. 66. On the Council of Europe, see https://www.coe.int/en/web/genderequality/-/artificial-intelligence-and-gender-equality. Accessed 31 October 2022.

  50. See, for example, the categorization in the European Institute for Gender Equality. 2021. Gender Equality Index. https://eige.europa.eu/gender-equality-index/2021. Accessed 31 October 2022; UN Women. 2022. UN Women Highlights 2021–2022. https://www.unwomen.org/en/annual-report/2022#our-results. Accessed 31 October 2022; and the European Commission for examples of the use of the terms (female) economic empowerment, leadership, and political participation. See also European Commission and Directorate-General for Communication. 2020. Striving for a Union of Equality: The Gender Equality Strategy 2020–2025. https://doi.org/10.2775/671326.

  51. However, a problematic aspect of measuring progress on SDG 5 is the lack of sufficient data: according to the latest UN Women report on progress on the Sustainable Development Goals, only 47% of the data needed to monitor SDG 5 are currently available. See UN Women. 2022. Progress on the Sustainable Development Goals: The Gender Snapshot 2022. https://www.unwomen.org/en/digital-library/publications/2022/09/progress-on-the-sustainable-development-goals-the-gender-snapshot-2022. Accessed 31 October 2022.

  52. United Nations. 2022. The Sustainable Development Goals Report 2022. https://unstats.un.org/sdgs/report/2022/. Accessed 31 October 2022.

  53. Sachs et al. (2022), p. 6.

  54. United Nations, Charter of the United Nations, 24 October 1945, 1 UNTS XVI, notably the Preamble and Article 1(3).

  55. UN General Assembly, International Covenant on Civil and Political Rights, 16 December 1966, United Nations, Treaty Series, vol. 999, p. 171, notably Art. 3.

  56. Convention on the Elimination of All Forms of Discrimination against Women (“CEDAW”), New York, 18 December 1979.

  57. Art. 2(b) of CEDAW.

  58. Art. 5(a) of CEDAW.

  59. See notably subsection III.2 for the roles of the different UN actors active in gender equality policies; see also Krook and True (2012).

  60. UNESCO. 2020. Artificial Intelligence and Gender Equality: Key Findings of UNESCO’s Global Dialogue. GEN/2020/AI/2 REV.

  61. United Nations High Commissioner for Human Rights. 2021. The Right to Privacy in the Digital Age. A/HRC/48/31.

  62. United Nations Human Rights Council. 2022. Rights of Persons with Disabilities, Report of the Special Rapporteur on the Rights of Persons with Disabilities. A/HRC/49/52.

  63. Parts of Section 2 on the UN actors are modeled on a paragraph in one of my previous articles (F. Lütz, n. 32), but they have been greatly expanded and revised here.

  64. Mégret and Alston (2020).

  65. Mégret and Alston (2020), pp. 99–131.

  66. Taylor and Mahon (2019).

  67. United Nations Security Council. 2000. Resolution 1325. S/RES/1325. See Willett (2010), Puechguirbal (2010), and Shepherd (2008).

  68. For a good overview of UN Women, see Bloch (2019).

  69. See the website of UN Women at https://www.unwomen.org/en. Accessed 31 October 2022.

  70. See UN Women. 2022. Are We on Track to Achieve Gender Equality by 2030? https://data.unwomen.org/features/are-we-track-achieve-gender-equality-2030. Accessed 31 October 2022.

  71. See Generation Equality Forum. n.d. https://forum.generationequality.org/home. Accessed 31 October 2022.

  72. See Generation Equality Forum Commitments. https://commitments.generationequality.org. Accessed 31 October 2022.

  73. See Generation Equality Forum Commitments. 2022. Technology and Innovation for Gender Equality. Action Coalition. https://commitments.generationequality.org/sites/default/files/2022-08/technology_and_innovation_1.pdf. Accessed 31 October 2022.

  74. See UNDP. 2022. Gender Equality. https://www.undp.org/speeches/gender-equality-undp. Accessed 31 October 2022; UNDP. 2020. Gender Strategy 2020 Annual Report. https://www.undp.org/sites/g/files/zskgke326/files/2021-07/UNDP-Gender%20Strategy-2020-Annual-Report.pdf. Accessed 31 October 2022.

  75. For criticism of measuring via the gender-related development index, see Dijkstra and Hanmer (2000); Dijkstra (2002); and Permanyer (2013), p. 110.

  76. Permanyer (2013), pp. 181–239; see also Fredman, Sandra, and Beth A. Goldblatt. 2015. Gender Equality and Human Rights. Discussion Paper. New York: United Nations Entity for Gender Equality and the Empowerment of Women. https://www.unwomen.org/sites/default/files/Headquarters/Attachments/Sections/Library/Publications/2015/Goldblatt-Fin.pdf. Accessed 31 October 2022.

  77. United Nations Human Rights Council. 2022. Rights of Persons with Disabilities, Report of the Special Rapporteur on the Rights of Persons with Disabilities. A/HRC/49/52.

  78. Mégret and Alston (2020), pp. 253–291; ECOSOC Resolution 11(II) of 21 June 1946.

  79. For a civil society actor’s view, see Rosche (2016).

  80. Rosche (2016), pp. 393–439.

  81. Bayefsky (2000).

  82. Sokhi-Bulley (2006).

  83. Cusack and Pusey (2013), Englehart and Miller (2014), and Sokhi-Bulley (2006).

  84. Optional Protocol to the Convention on the Elimination of All Forms of Discrimination against Women. 1999. UNGA 40; A/RES/54/4.

  85. For this expression, see Abiteboul and Dowek (2020).

  86. Although a single website search is neither exhaustive nor representative of all the work done by UN Women, it is nonetheless symptomatic that a search on the UN Women website yields only one result for “Artificial Intelligence”. UN Women. 2021. Integrate intersecting inequalities, leave no one behind. https://data.unwomen.org/features/integrate-intersecting-inequalities-leave-no-one-behind. Accessed 31 October 2022.

  87. See Cambridge Dictionary. n.d. Sustainable. https://dictionary.cambridge.org/dictionary/english/sustainable. Accessed 31 October 2022.

  88. Taleb (2012).

  89. An example of this can be seen in EU legislation, where the proposed Artificial Intelligence Act foresees an annex that the European Commission can amend through the delegated acts procedure without the involvement of the co-legislators; this notably concerns the AI systems that fall under the regulation. Any regulation could be modeled on such an approach, entrusting a regulatory authority to respond to changing needs without modifying the core of the legislative framework.

  90. Wachter et al. (2020, 2021); Bringas Colmenarejo et al. (2022); Angerschmid et al. (2022).

  91. On the notion of “trust”, see Rossi (2018); Sutrop (2019); Thelisson et al. 2017. Regulatory Mechanisms and Algorithms Towards Trust in AI/ML. Paper presented at the IJCAI 2017 Workshop on Explainable Artificial Intelligence (XAI), Melbourne, Australia; Thelisson, Eva. 2017. Towards Trust, Transparency and Liability in AI/AS Systems. Paper presented at the IJCAI 2017 Workshop on Explainable Artificial Intelligence (XAI), Melbourne, Australia; Middleton et al. (2022).

  92. For the literature on “trustworthiness”, see notably Hamon et al. (2022); Ashoori and Weisz (2019); Jain et al. (2020); Larsson et al. (2020).

  93. See in this regard also Federal Department of Foreign Affairs of Switzerland. 2022. Artificial Intelligence and International Rules. Report for the Federal Council, pp. 8–9. https://www.newsd.admin.ch/newsd/message/attachments/71099.pdf. Accessed 31 October 2022.

  94. For more on bias audits, see Brown et al. (2021).

  95. On AI impact assessments, see notably Mantelero (2018).

  96. Several policy proposals mention such processes, for example, European Commission. 2021. Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts. COM(2021) 206 final; Parliamentary Assembly of the Council of Europe, Committee on Equality and Non-Discrimination. 2020. Report on Preventing Discrimination Caused by the Use of Artificial Intelligence. Doc. 15151, para. 3.3 (“Regular testing, evaluation, reporting and auditing against state-of-the-art standards related to completeness, relevance, privacy, data protection, other human rights, unjustified discriminatory impacts”); and Council on Artificial Intelligence. 2019. Recommendation. OECD/LEGAL/0449, para. 2.3(b) (“assessment mechanisms”).

  97. Council on Artificial Intelligence. 2019. Recommendation. OECD/LEGAL/0449.

  98. International Labour Organization. 2009. Committee on Gender Equality, International Labour Conference. https://www.ilo.org/gender/Events/WCMS_111473/lang%2D%2Den/index.htm. Accessed 31 October 2022.

  99. See notably the four key ILO gender equality Conventions: the Equal Remuneration Convention (No. 100), the Discrimination (Employment and Occupation) Convention (No. 111), the Workers with Family Responsibilities Convention (No. 156), and the Maternity Protection Convention (No. 183), all available at https://www.ilo.org/dyn/normlex/en/f?p=1000:12000:::NO:::. Accessed 31 October 2022.

  100. See, for example, the discussion by the research department of the ILO, which reflects on inequalities and price discrimination but does not specifically address the risk of gender inequalities when AI is used in the world of work: Ernst et al. (2019).

  101. The main outcome of CSW67 (2023) will be agreed conclusions. At the time of writing, the zero draft of these agreed conclusions suggested to “Establish mandatory requirements for impact assessments and due diligence mechanisms to identify, prevent and mitigate societal risks and the negative impacts of digital technology on women and girls, especially by including affected groups, women’s rights organizations and human rights experts” and to “Adopt regulations mandating evaluation and audit requirements for the development and use of artificial intelligence to provide a secure and high-quality data infrastructure and systems that are either continually improved or terminated if human rights violation or gendered bias are identified” (points (ff) and (gg) of the zero draft agreed conclusions, Commission on the Status of Women, Sixty-seventh session, 6–17 March 2023, Innovation and technological change, and education in the digital age for achieving gender equality and the empowerment of all women and girls, p. 14, available at: https://www.unwomen.org/sites/default/files/2023-02/CSW67%20Agreed%20Conclusions_zero%20draft_1%20February%202023.pdf).

References

  • Abiteboul, Serge, and Gilles Dowek. 2020. The Age of Algorithms. Cambridge: Cambridge University Press.

    Book  Google Scholar 

  • Angerschmid, Alessa, Jianlong Zhou, Kevin Theuermann, Fang Chen, and Andreas Holzinger. 2022. Fairness and Explanation in AI-Informed Decision Making. Machine Learning and Knowledge Extraction 4: 556–579.

    Article  Google Scholar 

  • Ashoori, Maryam, and Justin D. Weisz. 2019. In AI We Trust? Factors That Influence Trustworthiness of AI-Infused Decision-Making Processes. arXiv:1912.02675 [cs.CY]. https://doi.org/10.48550/arXiv.1912.02675.

  • Bayefsky, A.F. 2000. The CEDAW Convention: Its Contribution Today. Proceedings of the ASIL Annual Meeting 94: 197–200.

    Article  Google Scholar 

  • Beard, Mary. 2017. Women & Power: A Manifesto. London: Profile Books.

    Google Scholar 

  • Beghini, Valentina, Umberto Cattaneo, and Emanuela Pozzan. 2022. The Urgency of a Quantum Leap for Gender Equality in the World of Work. In Gender Equality in the Mirror: Reflecting on Power, Participation and Global Justice, ed. Elisa Fornalé, 53–69. Leiden: Brill.

    Chapter  Google Scholar 

  • Bloch, Yanina. 2019. UN-Women: Ein Neues Kapitel Für Frauen in Den Vereinten Nationen. Baden-Baden: Nomos Verlagsgesellschaft.

    Book  Google Scholar 

  • Bohnet, Iris. 2016. What Works. Harvard: Harvard University Press.

    Book  Google Scholar 

  • Bringas Colmenarejo, Alejandra, Luca Nannini, Alisa Rieger, Kristen M. Scott, Xuan Zhao, Gourab K. Patro, Gjergji Kasneci, and Katharina Kinder-Kurlanda. 2022. Fairness in Agreement With European Values: An Interdisciplinary Perspective on AI Regulation. arXiv:2207.01510 [cs.CY]. https://doi.org/10.48550/arXiv.2207.01510

  • Brown, S., J. Davidovic, and A. Hasan. 2021. The Algorithm Audit: Scoring the Algorithms That Score Us. Big Data & Society. https://doi.org/10.1177/2053951720983865.

  • Budlender, Debbie, Diane Elston, Guy Hewitt, and Tanni Mukhopadhyay. 2002. Gender Budgets Make Cents: Understanding Gender Responsive Budgets. London: Commonwealth Secretariat.

    Google Scholar 

  • Buvinic, M., and R. Levine. 2016. Closing the Gender Data Gap. Significance 13: 34–37. https://doi.org/10.1111/j.1740-9713.2016.00899.x.

    Article  Google Scholar 

  • Cusack, Simone, and Lisa Pusey. 2013. CEDAW and the Rights to Non-Discrimination and Equality. Melbourne Journal of International Law 14: 54–92.

    Google Scholar 

  • Datta, A., M.C. Tschantz, and A. Datta. 2014. Automated Experiments on Ad Privacy Settings: A Tale of Opacity, Choice, and Discrimination. arXiv:1408.6491 [cs.CR]. https://doi.org/10.48550/arXiv.1408.6491.

  • Delage, Pauline. 2018. Droits Des Femmes, Tout Peut Disparaître. Paris: Éditions Textuel.

    Google Scholar 

  • Dijkstra, A. Geske. 2002. Revisiting UNDP’s GDI and GEM: Towards an Alternative. Social Indicators Research 57: 301–338.

    Article  Google Scholar 

  • Dijkstra, A. Geske, and Lucia C. Hanmer. 2000. Measuring Socio-Economic Gender Inequality: Toward an Alternative to the UNDP Gender-Related Development Index. Feminist Economics 6: 41–75.

    Article  Google Scholar 

  • Eden, Lorraine, and M. Fernanda Wagstaff. 2021. Evidence-Based Policymaking and the Wicked Problem of SDG 5 Gender Equality. Journal of International Business Policy 4: 28–57.

    Article  Google Scholar 

  • Englehart, Neil A., and Melissa K. Miller. 2014. The CEDAW Effect: International Law’s Impact on Women’s Rights. Journal of Human Rights 13: 22–47.

    Article  Google Scholar 

  • Ernst, E., R. Merola, and D. Samaan. 2019. Economics of Artificial Intelligence: Implications for the Future of Work. IZA Journal of Labor Policy 9: 1–35. https://doi.org/10.2478/izajolp-2019-0004.

    Article  Google Scholar 

  • Eubanks, Virginia. 2018. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. Stuttgart: Macmillan Publishers.

    Google Scholar 

  • Früh, A., and D. Haux. 2022. Foundations of Artificial Intelligence and Machine Learning. Weizenbaum Series. https://doi.org/10.34669/WI.WS/29.

  • Fry, Hannah. 2018. Hello World: How to Be Human in the Age of the Machine. London: Random House.

    Google Scholar 

  • Greaves, Hilary, and William MacAskill. 2019. The Case for Strong Longtermism. GPI Working Paper No. 5-2021. Accessed 31 October 2022. https://globalprioritiesinstitute.org/hilary-greaves-william-macaskill-the-case-for-strong-longtermism-2/.

  • Hamon, Ronan, Henrik Junklewitz, Ignacio Sanchez, Gianclaudio Malgieri, and Paul De Hert. 2022. Bridging the Gap between AI and Explainability in the GDPR: Towards Trustworthiness-by-Design in Automated Decision-Making. IEEE Computational Intelligence Magazine 17: 72–85.

    Article  Google Scholar 

  • Hosanagar, Kartik. 2020. A Human's Guide to Machine Intelligence: How Algorithms Are Shaping Our Lives and How We Can Stay in Control. New York: Viking Press.

    Google Scholar 

  • Houser, Kimberly A. 2019. Can AI Solve the Diversity Problem in the Tech Industry?: Mitigating Noise and Bias in Employment Decision-Making. Stanford Technology Law Review 22: 290–354.

    Google Scholar 

  • Ijjas, Flora. 2021. Sustainability and the Real Value of Care in Times of a Global Pandemic: SDG5 and Covid-19. Discover Sustainability 2: 1–9.

    Article  Google Scholar 

  • Jain, S., M. Luthra, S. Sharma, and M. Fatima. 2020. Trustworthiness of Artificial Intelligence. 2020 6th International Conference on Advanced Computing and Communication Systems (ICACCS). https://doi.org/10.1109/ICACCS48705.2020.9074237.

  • Kahneman, Daniel, Olivier Sibony, and Cass R. Sunstein. 2021. Noise: A Flaw in Human Judgment. London: William Collins.

    Google Scholar 

  • Kalbarczyk, A., N.-L. Aberman, B.S.M. van Asperen, R. Morgan, Z. Bhutta, R. Heidkamp, and S. Osendarp. 2022. Covid-19, Nutrition, and Gender: An Evidence-Based Approach to Gender-Responsive Policies and Programs. Social Science & Medicine. https://doi.org/10.1016/j.socscimed.2022.115364.

  • Kay, M., C. Matuszek and S.A. Munson. 2015. Unequal Representation and Gender Stereotypes in Image Search Results for Occupations. CHI '15: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. https://doi.org/10.1145/2702123.2702520.

  • Kazim, E., A. Soares Koshiyama, A. Hilliard, and R. Polle. 2021. Systematizing Audit in Algorithmic Recruitment. Journal of Intelligence 9: 46–57. https://doi.org/10.3390/jintelligence9030046.

    Article  Google Scholar 

  • Krook, Mona Lena, and Jacqui True. 2012. Rethinking the Life Cycles of International Norms: The United Nations and the Global Promotion of Gender Equality. European Journal of International Relations 18: 103–127.

    Article  Google Scholar 

  • Kurita, Keita, Nidhi Vyas, Ayush Pareek, Alan W. Black, and Yulia Tsvetkov. 2019. Quantifying Social Biases in Contextual Word Representations. ACL 2019. Accessed 31 October 2022. https://www.semanticscholar.org/paper/Quantifying-Social-Biases-in-Contextual-Word-Kurita-Vyas/3259d52ae00e65b98391e7e6a2f672dfee721bf8.

  • Lambrecht, Anja, and Catherine Tucker. 2019. Algorithmic Bias? An Empirical Study of Apparent Gender-Based Discrimination in the Display of STEM Career Ads. Management Science 65: 2966–2981.

    Article  Google Scholar 

  • Larsson, Stefan, Claire Ingram Bogusz, Jonas Andersson Schwarz, and Fredrik Heintz. 2020. Human-Centred AI in the EU: Trustworthiness as a Strategic Priority in the European Member States. Stockholm: Fores.

    Google Scholar 

  • Lee, Kai-Fu. 2021. AI 2041—Ten Visions for the Future. New York: Currency.

    Google Scholar 

  • Lütz, Fabian. 2022a. Discrimination by Correlation. Towards Eliminating Algorithmic Biases and Achieving Gender Equality. In In (Dis)Obedience in Digital Societies. Perspectives on the Power of Algorithms and Data, ed. Sven Quadflieg, Klaus Neuburg, and Simon Nestler, 250–293. Bielefeld: Transcript Verlag.

    Chapter  Google Scholar 

  • ———. 2022b. Gender Equality and Artificial Intelligence in Europe. Addressing Direct and Indirect Impacts of Algorithms on Gender-Based Discrimination. ERA Forum 23: 33–52.

    Article  Google Scholar 

  • ———. 2023a. Artificial Intelligence and Gender-Based Discrimination. In Human Rights and Artificial Intelligence, ed. Jeroen Temperman and Alberto Quintavalla. Oxford: Oxford University Press.

    Google Scholar 

  • ———. 2023b. Le rôle du droit pour contrer la discrimination algorithmique dans le recrutement automatisé. In La technologie, l‘humain et le droit, ed. Florence Guillaume. Bern: Stämpfli Verlag.

    Google Scholar 

  • MacAskill, William. 2022a. What We Owe the Future. New York: Basic Books.

    Google Scholar 

  • ———. 2022b. The Case for Longtermism. New York Times, August 5.

    Google Scholar 

  • Mantelero, Alessandro. 2018. AI and Big Data: A Blueprint for a Human Rights, Social and Ethical Impact Assessment. Computer Law & Security Review 34: 754–772.

    Article  Google Scholar 

  • Mégret, Frédéric, and Alston, Philip. 2020. The United Nations and Human Rights: A Critical Appraisal. Oxford: Oxford University Press.

    Google Scholar 

  • Middleton, Stuart E., Emmanuel Letouzé, Ali Hossaini, and Adriane Chapman. 2022. Trust, Regulation, and Human-in-the-Loop AI: Within the European Region. Communications of the ACM 65: 64–68.

    Article  Google Scholar 

  • Oliveira, Álvaro, Miguel de la Corte, and Rodríguez, and Fabian Lütz. 2020. The New Directive on Work-Life Balance: Towards a New Paradigm of Family Care and Equality? European Law Review 3: 295–323.

    Google Scholar 

  • Pasquale, Frank. 2020. New Laws of Robotics: Defending Human Expertise in the Age of AI. Harvard: Belknap Press.

    Book  Google Scholar 

  • Pederson, Ann, Lorraine Greaves, and Nancy Poole. 2015. Gender-Transformative Health Promotion for Women: A Framework for Action. Health Promotion International 30: 140–150. https://doi.org/10.1093/heapro/dau083.

    Article  Google Scholar 

  • Perez, Caroline Criado. 2019. Invisible Women: Exposing Data Bias in a World Designed for Men. New York: Random House.

    Google Scholar 

  • Permanyer, Iñaki. 2013. Are UNDP Indices Appropriate to Capture Gender Inequalities in Europe? Social Indicators Research 110: 927–950.

  • Prates, Marcelo O.R., Pedro H. Avelar, and Luís C. Lamb. 2020. Assessing Gender Bias in Machine Translation: A Case Study with Google Translate. arXiv:1809.02208 [cs.CY]. https://doi.org/10.48550/arXiv.1809.02208.

  • Puechguirbal, Nadine. 2010. Discourses on Gender, Patriarchy and Resolution 1325: A Textual Analysis of UN Documents. International Peacekeeping 17: 172–187.

  • Rosche, Daniela. 2016. Agenda 2030 and the Sustainable Development Goals: Gender Equality at Last? An Oxfam Perspective. Gender & Development 24: 111–126.

  • Rossi, Francesca. 2018. Building Trust in Artificial Intelligence. Journal of International Affairs 72: 127–134.

  • Sachs, Jeffrey, Christian Kroll, Guillaume Lafortune, Grayson Fuller, and Finn Woelm. 2022. Sustainable Development Report 2022. From Crisis to Sustainable Development, the SDGs as Roadmap to 2030 and Beyond. Cambridge: Cambridge University Press.

  • Schulz, Patricia. 2022. Progress in and Challenges to the Rights of Women to Non-Discrimination and Gender Equality. In Gender Equality in the Mirror: Reflecting on Power, Participation and Global Justice, ed. Elisa Fornalé, 25–52. Leiden: Brill.

  • Shepherd, Laura J. 2008. Power and Authority in the Production of United Nations Security Council Resolution 1325. International Studies Quarterly 52: 383–404.

  • Smith, Genevieve, and Ishita Rustagi. 2021. When Good Algorithms Go Sexist: Why and How to Advance AI Gender Equity. Stanford Social Innovation Review. https://doi.org/10.48558/A179-B138.

  • Sokhi-Bulley, Bal. 2006. The Optional Protocol to CEDAW: First Steps. Human Rights Law Review 6: 143–159.

  • Solomon, Divya Susan, Chandni Singh, and Farjana Islam. 2021. Examining the Outcomes of Urban Adaptation Interventions on Gender Equality Using SDG 5. Climate and Development 13: 830–841.

  • Sutrop, Margit. 2019. Should We Trust Artificial Intelligence? Trames: A Journal of the Humanities and Social Sciences 23: 499–522.

  • Taleb, Nassim Nicholas. 2012. Antifragile: Things That Gain from Disorder. New York: Random House Publishing Group.

  • Taylor, Sara Rose, and Rianne Mahon. 2019. Gender Equality from the MDGs to the SDGs: The Struggle Continues. In Achieving the Sustainable Development Goals, ed. Simon Dalby et al., 54–70. Oxford: Routledge.

  • Thelisson, Eva. 2017. Towards Trust, Transparency and Liability in AI/AS Systems. Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence Doctoral Consortium. https://doi.org/10.24963/ijcai.2017/767.

  • Wachter, Sandra, Brent Mittelstadt, and Chris Russell. 2020. Bias Preservation in Machine Learning: The Legality of Fairness Metrics under EU Non-Discrimination Law. West Virginia Law Review 123: 735–790.

  • ———. 2021. Why Fairness Cannot Be Automated: Bridging the Gap between EU Non-Discrimination Law and AI. arXiv:2005.05906 [cs.AI]. https://doi.org/10.48550/arXiv.2005.05906.

  • Willett, Susan. 2010. Introduction: Security Council Resolution 1325: Assessing the Impact on Women, Peace and Security. International Peacekeeping 17: 142–158.

  • Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. 1st ed. New York: PublicAffairs.

Author information

Correspondence to Fabian Lütz.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Lütz, F. (2023). Gender Equality and Artificial Intelligence: SDG 5 and the Role of the UN in Fighting Stereotypes, Biases, and Gender Discrimination. In: Fornalé, E., Cristani, F. (eds) Women’s Empowerment and Its Limits. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-031-29332-0_9

  • DOI: https://doi.org/10.1007/978-3-031-29332-0_9

  • Publisher Name: Palgrave Macmillan, Cham

  • Print ISBN: 978-3-031-29331-3

  • Online ISBN: 978-3-031-29332-0

  • eBook Packages: Social Sciences (R0)
