Abstract
Something seems wrong if a person’s health depends on their gender. Indeed, we might have continued to attribute this dependency (health-gender) to other factors, such as education or income, had it not been for the use of artificial intelligence-based systems in medicine and healthcare, which made us more aware of the broader picture: medical research and practice have not taken male and female bodies into account equally. Nonetheless, AI has to be trustworthy, and to that end it must be lawful, ethical, and robust. But how lawful and ethical can it be if it leaves half of humanity out of the picture? The focus of this chapter is therefore to address how medical AI could positively impact the achievement of gender equality as a Sustainable Development Goal (SDG). In particular, we use several use cases to highlight how medical AI applications have made evident the enormous data gap between male and female involvement in clinical trials, disease treatment, and other medical therapies, and how this data gap is the reason why so many AI applications are biased, limited, and inefficient. Filling this gap would mean improving and increasing data generation so as to reflect the particularities and specificities of female bodies and to enable female representation in training algorithms.
Notes
- 1.
For more about research involving these identities in medical and other domains, see Marshall et al. (2019).
- 2.
According to Dr Andrea Sottoriva, the team leader in evolutionary genomics and modelling at ICR and the REVOLVER study leader, the machine learning technique has the ability to “identify patterns in DNA mutation within cancers and forecast future genetic changes,” and it is expected to “transform the way cancer is diagnosed, managed and treated” (Health Europa 2018).
- 3.
Of these 29 medical devices, 21 are used in the medical specialty of radiology (with secondary specialties of 2 in cardiology, 6 in oncology, 3 in neurology, and 4 in emergency medicine, while the others have no clearly stated secondary specialty), 1 in neurology, 1 in ophthalmology, 2 in endocrinology, 3 in cardiology, and 1 in internal medicine.
- 4.
It is hard to gauge how AI is used in the EU, as there is no public database to consult for this information. According to the MDR, a database, Eudamed, should soon serve this purpose.
- 5.
According to the FDA, there are three classes of medical devices: class I, class II, and class III. A medical device is assigned to one class or another depending on the risk associated with its use, its intended uses, the duration of use, and other factors. Class I devices are subject to general controls (such as good manufacturing practices and labelling requirements); class II devices are subject to special controls determined by the FDA on a case-by-case basis and require that the manufacturer file a premarket notification with the FDA; and class III devices must go through the most stringent regulatory process: premarket approval.
- 6.
According to the provisions of the Regulation (EU) 2017/745 of the European Parliament and of the Council, of 5 April 2017, on medical devices (hereinafter, the MDR).
- 7.
It is worth highlighting that, in the case of embedded software, the sensors are installed in the hardware; with stand-alone software, however, information is obtained through non-physical sensors. An example is an Internet browser or website that uses cookies to collect information about the user’s search preferences in order to provide a more personalised experience. In this regard, women generally have less access to technologies of any kind, including but not limited to the Internet (Cirillo et al. 2020).
- 8.
Humans should not be viewed through the lens of objectivity; we are not born to act in all-or-nothing terms. From the standpoint of economic rationality this is not the desired scenario, as economic theories applied to human behaviour (behavioural economics) hold that humans can be nudged towards a specific objective by changing the incentives at stake. A person might prefer a more fallible treatment granting a 50% chance of being cured (the other 50% being an innocuous result) over a treatment promised to be more effective but with an unknown rate of error.
- 9.
Other biases include historical bias, measurement bias, aggregation bias, evaluation bias, and algorithmic bias; all of these are explained in Cirillo et al. (2020).
- 10.
- 11.
For more on this tool, see https://www.inne.io/en/home/
- 12.
However, we do not address here the sustainability of AI development, in particular its environmental impact, which is described in Strubell et al. (2019): the authors quantify the cost of training neural network models for Natural Language Processing, among other tasks, in terms of energy consumption, and invite academic and industry stakeholders to choose environmentally friendly hardware and software.
Bibliography
Abbott, R. 2020. The Reasonable Robot: Artificial Intelligence and the Law. Cambridge: Cambridge University Press.
Academy of Medical Royal Colleges. 2019. Artificial Intelligence in Healthcare. Available at https://www.aomrc.org.uk/wp-content/uploads/2019/01/Artificial_intelligence_in_healthcare_0119.pdf.
Ahuja, A.S. 2019. The Impact of Artificial Intelligence in Medicine on the Future Role of the Physician. PeerJ: Life & Environment 7. Available at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6779111/.
Akazawa, M., and K. Hashimoto. 2020. Artificial Intelligence in Ovarian Cancer Diagnosis. Anticancer Research 40 (8): 4795–4800.
Albert, P.R. 2015. Why Is Depression More Prevalent in Women? Journal of Psychiatry and Neuroscience 40 (4): 219–221.
Barredo Arrieta, A., N. Díaz-Rodríguez, J. Del Ser, A. Bennetot, S. Tabik, A. Barbado, S. Garcia, S. Gil-López, D. Molina, R. Benjamins, R. Chatila, and F. Herrera. 2020. Explainable Artificial Intelligence (XAI): Concepts, Taxonomies, Opportunities and Challenges Toward Responsible AI. Information Fusion 58: 82–115.
Bathaee, Y. 2018. The Artificial Intelligence Black Box and the Failure of Intent and Causation. Harvard Journal of Law & Technology 31 (2): 889–938.
Beery, A.K., and I. Zucker. 2011. Sex Bias in Neuroscience and Biomedical Research. Neuroscience: Faculty Publications, Smith College, Northampton, MA. Available at https://core.ac.uk/download/pdf/287355536.pdf.
Benjamens, S., P. Dhunnoo, and B. Meskó. 2020. The State of Artificial Intelligence-Based FDA-Approved Medical Devices and Algorithms: An Online Database. NPJ Digital Medicine 3 (118): 1–8.
Borgesius, F.Z. 2018. Discrimination, Artificial Intelligence, and Algorithmic Decision-Making. Strasbourg: Council of Europe.
Buolamwini, J., and T. Gebru. 2018. Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Proceedings of Machine Learning Research 81: 1–15.
Cirillo, D., et al. 2020. Sex and Gender Differences and Biases in Artificial Intelligence for Biomedicine and Healthcare. NPJ Digital Medicine 3 (81). Available at https://www.nature.com/articles/s41746-020-0288-5#citeas.
Couldry, N., and U.A. Mejias. 2019. Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject. Television and New Media 20 (4): 336–349.
Criado Perez, C. 2019. Invisible Women: Data Bias in a World Designed for Men. New York: Abrams Press.
Daly, C., F. Clemens, J.L. Lopez Sendon, L. Tavazzi, E. Boersma, N. Danchin, F. Delahaye, A. Gitt, D. Julian, D. Mulcahy, W. Ruzyllo, K. Thygesen, F. Verheugt, and K.M. Fox. 2006. Gender Differences in the Management and Clinical Outcome of Stable Angina. Circulation 113: 490–498.
Dusenbery, M. 2018. Doing Harm: The Truth About How Bad Medicine and Lazy Science Leave Women Dismissed, Misdiagnosed and Sick. New York: HarperOne.
Friedman, B., P.H. Kahn Jr., A. Borning, and A. Huldtgren. 2013. Value Sensitive Design and Information Systems. In Early Engagement and New Technologies: Opening Up the Laboratory, ed. N. Doorn, D. Schuurbiers, I. van de Poel, and M.E. Gorman, 55–95. Dordrecht: Springer.
Guerriero, S., et al. 2021. Artificial Intelligence (AI) in the Detection of Rectosigmoid Deep Endometriosis. European Journal of Obstetrics & Gynecology and Reproductive Biology 261: 29–33.
Gunning, D., M. Stefik, J. Choi, T. Miller, S. Stumpf, and G. Yang. 2019. XAI-Explainable Artificial Intelligence. Science Robotics 4: 1–2.
Health Europa. 2018. Towards Personalised Medicine: Artificial Intelligence in Cancer. Interview accessible at https://www.healtheuropa.eu/artificial-intelligence-in-cancer/88685/.
High-Level Expert Group on Artificial Intelligence. 2019a. Ethics Guidelines for Trustworthy AI. Report accessible at https://digital-strategy.ec.europa.eu/en/library/ethics-guidelines-trustworthy-ai.
———. 2019b. A Definition of AI: Main Capabilities and Disciplines. Report accessible at https://ec.europa.eu/futurium/en/system/files/ged/ai_hleg_definition_of_ai_18_december_1.pdf.
Larrazabal, A.J., N. Nieto, V. Peterson, D.H. Milone, and E. Ferrante. 2020. Gender Imbalance in Medical Imaging Datasets Produces Biased Classifiers for Computer-Aided Diagnosis. Proceedings of the National Academy of Sciences 117 (23): 12592–12594.
Liaudat, C.C., P. Vaucher, T. De Francesco, N. Jaunin-Stadler, L. Herzig, F. Verdon, B. Favrat, I. Locatelli, and C. Clair. 2018. Sex/Gender Bias in the Management of Chest Pain in Ambulatory Care. Women’s Health 14: 1–9.
Littman, M.L., I. Ajunwa, G. Berger, C. Boutilier, M. Currie, F. Doshi-Velez, G. Hadfield, M.C. Horowitz, C. Isbell, H. Kitano, K. Levy, T. Lyons, M. Mitchell, J. Shah, S. Sloman, S. Vallor, and T. Walsh. 2021. Gathering Strength, Gathering Storms: The One Hundred Year Study on Artificial Intelligence. Available at https://ai100.stanford.edu/2021-report/gathering-strength-gathering-storms-one-hundred-year-study-artificial-intelligence.
Liu, K.A., and N.A. Dipietro Mager. 2016. Women’s Involvement in Clinical Trials: Historical Perspective and Future Implications. Pharmacy Practice 14 (1). Available at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4800017/.
Marshall, Z., et al. 2019. Documenting Research with Transgender, Nonbinary, and Other Gender Diverse (Trans) Individuals and Communities: Introducing the Global Trans Research Evidence Map. Transgender Health 4(1). Available at https://www.liebertpub.com/doi/full/10.1089/trgh.2018.0020.
McGregor, A.J., M. Hasnain, K. Sandberg, M.F. Morrison, M. Berlin, and J. Trott. 2016. How to Study the Impact of Sex and Gender in Medical Research: A Review of Resources. Biology of Sex Differences 7 (Suppl 1): 61–72.
Meiliana, A., et al. 2019. Artificial Intelligence in Healthcare. The Indonesian Biomedical Journal 11 (2): 125–135.
Nidumolu, R., et al. 2009. Why Sustainability Is Now the Key Driver of Innovation. Harvard Business Review, September 2009. Available at https://hbr.org/2009/09/why-sustainability-is-now-the-key-driver-of-innovation.
Osoba, O., and W. Welser IV. 2017. An Intelligence in Our Image. The Risks of Bias and Errors in Artificial Intelligence. RAND Corporation. Available at https://www.rand.org/content/dam/rand/pubs/research_reports/RR1700/RR1744/RAND_RR1744.pdf.
Panth, S. 1997. Technological Innovation, Industrial Evolution, and Economic Growth. London/New York: Garland Publishing.
Petrone, J. 2018. FDA Approves Stroke-Detecting AI Software. Nature Biotechnology 36: 290.
Schwartz, R., et al. 2021. A Proposal for Identifying and Managing Bias in Artificial Intelligence, Draft NIST Special Publication 1270. National Institute of Standards and Technology. Available at https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.1270-draft.pdf.
Shannon, J. 2018. Heart Attack – It’s Different for Women. Irish Heart Foundation. Available at https://irishheart.ie/news/heart-attack-its-different-for-women/.
Strubell, E., et al. 2019. Energy and Policy Considerations for Deep Learning in NLP. Available at https://arxiv.org/pdf/1906.02243.pdf.
Sumathi, M., et al. 2021. Study and Detection of PCOS Related Diseases Using CNN. IOP Conference Series: Materials Science and Engineering 1070. Available at https://iopscience.iop.org/article/10.1088/1757-899X/1070/1/012062/meta.
Tahhan, A.S., M. Vaduganathan, S.J. Greene, A. Alrohaibani, M. Raad, M. Gafeer, G.C. Fonarow, P.S. Douglas, D.L. Bhatt, and J. Butler. 2020. Enrollment of Older Patients, Women, and Racial/Ethnic Minority Groups in Contemporary Acute Coronary Syndrome Clinical Trials. A Systematic Review. JAMA Cardiology 5(6): E1–E9.
Tat, E., D.L. Bhatt, and M.G. Rabbat. 2020. Addressing Bias: Artificial Intelligence in Cardiovascular Medicine. The Lancet 2: e635–e636.
Te-Ping, C. 2021. Women Founders of AI Startups Take Aim at Gender Bias. Wall Street Journal, 29 September 2021.
The Royal Society. 2019. Explainable AI: The Basics. Policy Briefing. Available at https://royalsociety.org/-/media/policy/projects/explainable-ai/AI-and-interpretability-policy-briefing.pdf.
Umbrello, S., M. Capasso, M. Balistreri, A. Pirni, and F. Merenda. 2021. Value Sensitive Design to Achieve the UN SDGs with AI: A Case of Elderly Care Robots. Minds and Machines 31: 395–419.
UNESCO. 2020. Artificial Intelligence and Gender Equality. Report available at https://en.unesco.org/AI-and-GE-2020.
Van Wynsberghe, A. 2021. Sustainable AI: AI for Sustainability and the Sustainability of AI. AI and Ethics 1: 213–218.
Vinuesa, R., H. Azizpour, I. Leite, M. Balaam, V. Dignum, S. Domisch, A. Felländer, S.D. Langhans, M. Tegmark, and F.F. Nerini. 2020. The Role of Artificial Intelligence in Achieving the Sustainable Development Goals. Nature Communications 11 (233): 1–10.
Watson, D., J. Krutzinna, I. Bruce, C. Griffiths, I. McInnes, M. Barnes, and L. Floridi. 2019. Clinical Applications of Machine Learning Algorithms: Beyond the Black Box. Available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3352454.
WHO. 2016. Women’s Health and Well-Being in Europe: Beyond the Mortality Advantage. Report accessible at https://www.euro.who.int/en/health-topics/health-determinants/gender/publications/2016/womens-health-and-well-being-in-europe-beyond-the-mortality-advantage-2016.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
García-Micó, T.G., and M. Laukyte. 2023. Gender, Health, and AI: How Using AI to Empower Women Could Positively Impact the Sustainable Development Goals. In: Mazzi, F., and L. Floridi (eds), The Ethics of Artificial Intelligence for the Sustainable Development Goals. Philosophical Studies Series, vol 152. Springer, Cham. https://doi.org/10.1007/978-3-031-21147-8_16
DOI: https://doi.org/10.1007/978-3-031-21147-8_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-21146-1
Online ISBN: 978-3-031-21147-8
eBook Packages: Religion and Philosophy (R0)