
Gender Differences and Bias in Artificial Intelligence

Gender in AI and Robotics

Part of the book series: Intelligent Systems Reference Library ((ISRL,volume 235))

Abstract

Artificial Intelligence (AI) supports decisions in ways that increasingly affect humans in many aspects of their lives. Both autonomous and decision-support systems applying AI algorithms and data-driven models are used for decisions about justice, education, and physical and psychological health, and to grant or deny access to credit, healthcare, and other essential resources, in increasingly ubiquitous and sometimes opaque ways. Too often, systems are built without considering human factors associated with their use, such as gender bias. Clarity about the correct way to employ such systems is an increasingly critical aspect of design, implementation, and presentation. Models and systems produce results that are difficult to interpret and are blamed for being good or bad, whereas it is the design of such tools, and the training needed to integrate them with human values, that can be good or bad. This chapter discusses the most evident issues of gender bias in AI and explores possible solutions to the impact of AI and decision-support algorithms on humans, with a focus on how to integrate gender-balance principles into data sets, AI agents, and scientific research in general.


Notes

  1. At the time this page is written, the world is facing a critical step backwards in gender equality, with Afghanistan facing the Taliban's return to power. Clear data about this part of the world are still not available. ISIS is bombing Kabul's airport to force people to stay under Sharia rule, under which women are denied what in other countries are considered human rights. Students and researchers accepted at foreign universities cannot leave the country. Women are handing their children to the US Army and to European ambassadors, hoping for them to be transported out of Afghanistan.

  2. Note of the author: the situation is so difficult that I had trouble getting to sleep while writing these paragraphs; it is a worrying situation, with a dark past and future, and it can be emotionally challenging for any woman in the field even to speak about it. Many studies and surveys are circulating to track such issues, all of them anonymous in order to protect freedom of speech.


Author information

Correspondence to Valentina Franzoni.


Copyright information

© 2023 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Franzoni, V. (2023). Gender Differences and Bias in Artificial Intelligence. In: Vallverdú, J. (ed.) Gender in AI and Robotics. Intelligent Systems Reference Library, vol 235. Springer, Cham. https://doi.org/10.1007/978-3-031-21606-0_2

