
Gender, Health, and AI: How Using AI to Empower Women Could Positively Impact the Sustainable Development Goals

Chapter in: The Ethics of Artificial Intelligence for the Sustainable Development Goals

Part of the book series: Philosophical Studies Series (PSSP, volume 152)

Abstract

There seems to be something wrong when a person’s health depends on their gender. Indeed, we might have continued to attribute this dependency (health-gender) to other factors, such as education or income, had it not been for the use of artificial intelligence-based systems in medicine and healthcare, which have made us more aware of the broader picture: medical research and practice have not taken male and female bodies into account equally. For AI to be trustworthy, it must be lawful, ethical, and robust. But how lawful and ethical can it be if it leaves half of humanity out of the picture? The focus of this chapter is therefore to address how medical AI could positively impact the achievement of gender equality as a Sustainable Development Goal (SDG). In particular, we draw on several use cases to highlight how medical AI applications have made evident the enormous data gap between male and female involvement in clinical trials, disease treatment, and other medical therapies, and how this data gap is the reason why so many AI applications are biased, limited, and inefficient. Filling this gap would mean improving and increasing data generation that reflects the particularities and specificities of female bodies and enables female representation in the data used to train algorithms.


Notes

  1.

    For more on research involving these identities in the medical and other domains, see Marshall et al. (2019).

  2.

    According to Dr Andrea Sottoriva, the team leader in evolutionary genomics and modelling at ICR and the REVOLVER study leader, the machine learning technique has the ability to “identify patterns in DNA mutation within cancers and forecast future genetic changes,” and it is expected to “transform the way cancer is diagnosed, managed and treated” (Health Europa 2018).

  3.

    Of these 29 medical devices, 21 are used in the medical specialty of radiology (with secondary specialties of 2 in cardiology, 6 in oncology, 3 in neurology, and 4 in emergency medicine; for the remainder, no secondary medical specialty is clearly stated), 1 in neurology, 1 in ophthalmology, 2 in endocrinology, 3 in cardiology, and 1 in internal medicine.

  4.

    It is hard to envisage how AI is used in the EU, as there is no public database to consult for this information. According to the MDR, a database, Eudamed, should soon be available for this purpose.

  5.

    According to the FDA, there are three classes of medical devices: class I, class II, and class III. A device is classified into one class or another depending on the risk associated with its use, its intended uses, the duration of use, and other factors. Class I devices are subject to general controls (such as good manufacturing practices and labelling requirements); class II devices are subject to special controls determined by the FDA on a case-by-case basis and require the manufacturer to file a premarket notification with the FDA; and class III devices must go through the most stringent regulatory process: premarket approval.

  6.

    According to the provisions of the Regulation (EU) 2017/745 of the European Parliament and of the Council, of 5 April 2017, on medical devices (hereinafter, the MDR).

  7.

    It is worth highlighting that in the case of embedded software, the sensors are installed in the hardware, whereas with stand-alone software, information is obtained through non-physical sensors. An example is an Internet browser or website that uses cookies to collect information about the user’s search preferences in order to provide a more personalised experience. In this regard, women generally have less access to technologies of any kind, including but not limited to the Internet (Cirillo et al. 2020).

  8.

    Humans should not be viewed through the lens of pure objectivity; we are not born to act in all-or-nothing terms. From the standpoint of economic rationality, this scenario is not desirable either, since economic theories applied to human behaviour (behavioural economics) hold that humans can be nudged towards a specific objective by changing the incentives at stake. A person might prefer a more fallible treatment that grants a 50% chance of being cured (the other 50% being an innocuous result) over a treatment promised to be more effective but with an unknown error rate.

  9.

    Other biases are historical bias, measurement bias, aggregation bias, evaluation bias, and algorithmic bias; all of these are explained in Cirillo et al. (2020).

  10.

    There are many initiatives that contribute to making gender equality a reality in research settings: for a list of initiatives in AI, see UNESCO (2020); for recommendations on incorporating gender and sex in research, see McGregor et al. (2016).

  11.

    For more on this tool, see https://www.inne.io/en/home/

  12.

    However, we do not address here the sustainability of AI development, and in particular its environmental impact, as described in Strubell et al. (2019): the authors show the cost of training neural network models for Natural Language Processing, among other tasks, in terms of energy consumption, and invite academic and industry stakeholders to choose environmentally friendly hardware and software.


Author information


Correspondence to Tomás Gabriel García-Micó.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

García-Micó, T.G., Laukyte, M. (2023). Gender, Health, and AI: How Using AI to Empower Women Could Positively Impact the Sustainable Development Goals. In: Mazzi, F., Floridi, L. (eds) The Ethics of Artificial Intelligence for the Sustainable Development Goals. Philosophical Studies Series, vol 152. Springer, Cham. https://doi.org/10.1007/978-3-031-21147-8_16

