Abstract
The threat posed by Lethal Autonomous Weapon Systems (LAWS) is among the most severe challenges facing humanity today. Climate change, geopolitical instability and economic recessions all form part of the current ‘permacrisis’; I would add ‘threats from AI weaponry’ to this list. This paper explores in depth the ethical issues and possible negative consequences of allowing the development and deployment of AI weaponry. In this paper, I demonstrate that the use of LAWS in warfare is unethical and that it violates not only human rights and International Humanitarian Law (IHL) but human dignity as well. I do this by first examining Ronald C. Arkin’s article. I then explore the debate on LAWS from a consequentialist ethical perspective. I outline the arguments both in opposition to and in support of the deployment of LAWS to determine the outcome of the moral calculus around which much of the debate has centred. I then explore the principled arguments against the use of LAWS. These arguments focus on possible violations of human rights, IHL and human dignity. The final section offers a way forward in terms of how a ban on LAWS might be implemented and the roles of governments, academics, and the public.
Notes
1. International Humanitarian Law is the law which regulates the conduct of war and limits the effects of armed conflicts [9].
2. A balance of power refers to the idea that states try to maintain a roughly equal distribution of power to prevent dominance by any single state [1].
3. Proportionality here refers to one of the three principles of Jus in Bello that must be met for a military action to be considered legal. “A military action is proportional if the harms are proportionate to the military advantage” [16].
4. The burden of war refers to those most directly impacted by the negative consequences of war. Though human soldiers are usually the direct victims of military operations, this may shift with the introduction of LAWS on the battlefield [19].
5. The arbitrary deprivation of life violates IHL; the argument here is that the taking of life by an autonomous system would automatically violate this clause [28].
6. “Humans-in-the-loop are systems that select targets and deliver force with a human command, human-on-the-loop are systems that select targets and deliver force with oversight from a human operator and human-out-of-the-loop are systems that can select targets and deliver force without any human input or interaction” [8].
7. As in the case of rape, slavery, or genocide [13].
References
Anderson, S.M.: Balance of power. Wiley Online Library (2018). https://onlinelibrary.wiley.com/. Accessed 23 June 2023
Arkin, R.: The case for ethical autonomy in unmanned systems. J. Milit. Ethics 9(4), 332–341 (2010)
Baker, D.: Should We Ban Killer Robots? Polity Press, Cambridge (2022)
Blanchard, A., Taddeo, M.: Jus in bello necessity, the requirement of minimal force, and autonomous weapons systems. J. Milit. Ethics 21(3–4), 286–303 (2022). https://doi.org/10.1080/15027570.2022.2157952
Blanchard, A., Edgar, E., McNeish, D., Taddeo, M.: Ethical principles for AI in national defence. Philos. Technol. 34(1), 1707–1729 (2021). https://doi.org/10.1007/s13347-021-00482-3
Cernea, M.: The ethical troubles of future warfare. On the prohibition of autonomous weapon systems. ANNALS Univ. Bucharest Philos. Ser. 66(2), 67–89 (2018)
Davis, J.: Ethics and insights: the ethics of AI in warfare. Naval Postgraduate School (2021). https://nps.edu/documents/110773463/135759179/Ethics+and+Insights+The+Ethics+of+AI+in+Warfare.pdf/dfa0271f-1b93-9495-69a3-fa160ebb2f77?t=1652136179368. Accessed 10 July 2023
Docherty, B.: Losing Humanity: The Case Against Killer Robots. Human Rights Watch (2012). https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots. Accessed 11 May 2023
European Commission. European Civil Protection and Humanitarian Aid Operations (2019). https://civil-protection-humanitarian-aid.ec.europa.eu/what/humanitarian-aid/international-humanitarian-law_en. Accessed 26 Apr 2023
Ezra, A.: Renewable energy alone can’t address data centre’s adverse environmental impact. Forbes Technology Council (2021). https://www.forbes.com/sites/forbestechcouncil/2021/05/03/renewable-energy-alone-cant-address-data-centers-adverse-environmental-impact/?sh=384bdd5b5ddc. Accessed 07 Aug 2023
Fallmann, D.: Human cognitive bias and its role in AI. Forbes (2021). https://www.forbes.com/sites/forbestechcouncil/2021/06/14/human-cognitive-bias-and-its-role-in-ai/?sh=43fa1f6527b9. Accessed 23 May 2023
Galliott, J., McFarland, T.: Autonomous systems in a military context (part 2): a survey of the ethical issues. Int. J. Robot. Appl. Technol. 4(2), 53–68 (2016). https://doi.org/10.4018/IJRAT.2016070104
Heyns, C.: Autonomous weapons in armed conflict and the right to a dignified life: an African perspective. South Afr. J. Hum. Rights 33(1), 46–71 (2017). https://doi.org/10.1080/02587203.2017.1303903
ICRC. What You Need to Know About Autonomous Weapons Systems (2022). https://www.icrc.org/en/document/what-you-need-know-about-autonomous-weapons. Accessed 24 May 2023
IEA. Data Centers and Data Transmission Networks (2023). https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks. Accessed 07 Aug 2023
Lazar, S.: War. The Stanford Encyclopedia of Philosophy. Zalta, E.N. (ed.) (2020). https://plato.stanford.edu/entries/war/. Accessed 28 June 2023
Maas, M.: Regulating for ‘normal AI accidents’: operational lessons for the responsible governance of artificial intelligence deployment. In: Proceedings of the 2018 AAAI/ACM Conference on AI, Ethics, and Society (2018)
Müller, V.: Autonomous killer robots are probably good news. In: Di Nucci, E., Santoni de Sio, F. (eds.) Drones and Responsibility: Legal, Philosophical and Socio-Technical Perspectives on the Use of Remotely Controlled Weapons, pp. 67–81. Ashgate, London (2016)
Plümper, T., Neumayer, E.: The unequal burden of war: the effect of armed conflict on the gender gap in life expectancy. Int. Organ. 60(3), 723–754 (2006)
Riesen, E.: The moral case for the development and use of autonomous weapon systems. J. Milit. Ethics 21(2), 132–150 (2022). https://doi.org/10.1080/15027570.2022.2124022
Reichberg, G.M., Syse, H.: Applying AI on the battlefield: the ethical debates. In: von Braun, J.S., Archer, M., Reichberg, G.M., Sánchez Sorondo, M. (eds.) Robotics, AI, and Humanity, pp. 147–159. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-54173-6_12
Russell, S.: Banning lethal autonomous weapons: an education. Issues Sci. Technol. 38(3) (2022)
Sharkey, A.: Autonomous weapons systems, killer robots and human dignity. Ethics Inf. Technol. 21(2), 75–87 (2018). https://doi.org/10.1007/s10676-018-9494-0
Sinnott-Armstrong, W.: Consequentialism. The Stanford Encyclopedia of Philosophy. Zalta, E.N., Nodelman, U. (eds.) (2022). https://plato.stanford.edu/entries/consequentialism/. Accessed 31 May 2023
Swett, B., Hahn, E., Lourens, A.: Designing robots for the battlefield: state of the art. In: von Braun, J.S., Archer, M., Reichberg, G.M., Sánchez Sorondo, M. (eds.) Robotics, AI, and Humanity, pp. 131–146. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-54173-6_11
Taddeo, M., Blanchard, A.: A comparative analysis of the definitions of autonomous weapons systems. Sci. Eng. Ethics 28(37) (2022). https://doi.org/10.1007/s11948-022-00392-3
UNESCO. Recommendation on the Ethics of Artificial Intelligence. UNESDOC Digital Library (2022). https://unesdoc.unesco.org/ark:/48223/pf0000380455. Accessed 07 Aug 2023
UNGA. International Covenant on Civil and Political Rights (2019). https://www.ohchr.org/sites/default/files/Documents/ProfessionalInterest/ccpr.pdf. Accessed 08 Aug 2023
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Styber, M. (2023). Warfare in the Age of AI: A Critical Evaluation of Arkin’s Case for Ethical Autonomy in Unmanned Systems. In: Pillay, A., Jembere, E., Gerber, A.J. (eds.) Artificial Intelligence Research. SACAIR 2023. Communications in Computer and Information Science, vol 1976. Springer, Cham. https://doi.org/10.1007/978-3-031-49002-6_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-49001-9
Online ISBN: 978-3-031-49002-6
eBook Packages: Computer Science (R0)