
Why reciprocity prohibits autonomous weapons systems in war

  • Original Research

Abstract

This paper argues that there ought to be a categorical ban on autonomous weapons systems (AWS) in warfare. First, I provide a foundational argument that international humanitarian law (jus in bello) is deontological. Following the arguments of Peter Asaro and Robert Sparrow, I then argue that AWS lack the ability to properly acknowledge their targets and consequently breach jus in bello principles. I go further than Asaro and Sparrow, however, by emphasizing the necessity of reciprocity for deontological law. Because AWS lack a constitutive symmetry with human combatants, humans and AWS cannot coexist in warfare if the existing international principles are to be respected. After addressing foreseeable objections, including arguments from reducing deaths and from the prohibition of other weapons, I conclude that a categorical ban on AWS remains a reasonable position. The benefit of this paper is that it avoids complex and hypothetical considerations of future developments in AWS capabilities. It also shows that if the moral underpinnings of jus in bello principles are respected, then categorically banning AWS from warfare is already an accepted position.


Notes

  1. For example, the US set aside US$18 billion between 2016 and 2020 for AWS development [5].

  2. According to the report, these machines were autonomous as once in the battlefield they could, “…attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability.” [13, p. 17/548].

  3. This term refers to combatants who, often through injury, are unable to perform regular military duties.

  4. There are many ways to define illegitimate harm. By illegitimate harm in warfare, including deaths, I mean only harm that occurs without respect for the principles of jus in bello.

  5. While they are all open to AWS sufficiently progressing to meet the necessary moral standards, their arguments are not precisely the same. Arkin requires only accurate performance, while, as mentioned in the previous section, Asaro and Sparrow require, in addition to performance, an associated awareness of the other human’s intrinsic value.

  6. While the high level of reasoning required in this argument is currently beyond the capabilities of AI, some experts consider it likely to emerge within the next few decades; see [6, 9].

  7. This is modified from Korsgaard’s example between a military general and private [12, p. 124].

  8. My thanks to the anonymous reviewer for raising this objection.

  9. I am alluding to someone who is aware of the practical realities that may require concessions to one’s philosophical obligations.

References

  1. Arkin, R.: Lethal autonomous systems and the plight of the non-combatant. In: Kiggins, R. (ed.) The Political Economy of Robots—Prospects for Prosperity and Peace in the Automated 21st Century, pp. 317–326. Palgrave Macmillan, Cham (2018). https://doi.org/10.1007/978-3-319-51466-6_15


  2. Asaro, P.: Autonomous Weapons and the Ethics of Artificial Intelligence. In: Liao, S.M. (ed.) Ethics of Artificial Intelligence, pp. 212–236. Oxford University Press, Oxford (2020)


  3. Cantrell, H.: Autonomous weapon systems and the claim-rights of innocents on the battlefield. AI Ethics (2021). https://doi.org/10.1007/s43681-021-00119-3


  4. Davison, N.: A legal perspective: autonomous weapon systems under international humanitarian law. UNODA Occasional Papers No. 30, November 2017 (2018). https://doi.org/10.18356/29a571ba-en

  5. Dawes, J.: UN fails to agree on ‘killer robot’ ban as nations pour billions into autonomous weapons research. The Conversation. https://theconversation.com/un-fails-to-agree-on-killer-robot-ban-as-nations-pour-billions-into-autonomous-weapons-research-173616 (2021). Accessed 22 Apr 2022

  6. Dilmegani, C.: When will singularity happen? 995 experts’ opinions on AGI. AI Multiple. https://research.aimultiple.com/artificial-general-intelligence-singularity-timing/ (2022). Accessed 12 Apr 2022

  7. Dreveskracht, R.: Just war in international law: an argument for a deontological approach to humanitarian law. Buff Hum Rts L Rev 16, 237–288 (2010)


  8. Felter, J.H., Shapiro, J.N.: Limiting civilian casualties as part of a winning strategy: the case of courageous restraint. Dædalus (2017). https://doi.org/10.1162/DAED_a_00421


  9. Haner, J., Garcia, D.: The artificial intelligence arms race: trends and world leaders in autonomous weapons development. Glob Policy (2019). https://doi.org/10.1111/1758-5899.12713


  10. Jenkins, R., Purves, D.: Robots and respect: a response to Robert Sparrow. Ethics Int. Aff. (2016). https://doi.org/10.1017/S0892679416000277


  11. Kant, I.: Groundwork of the Metaphysics of Morals. Gregor, M. (ed. and trans.). Cambridge University Press, Cambridge (1998)

  12. Korsgaard, C.M.: Fellow Creatures: Our Obligations to the Other Animals. Oxford University Press, Oxford (2018)


  13. Majumdar Roy Choudhury, L. et al.: Final report of the Panel of Experts on Libya established pursuant to Security Council resolution 1973 (2011) S/2021/229. United Nations Security Council. https://documents-dds-ny.un.org/doc/UNDOC/GEN/N21/037/72/PDF/N2103772.pdf?OpenElement (2021). Accessed 15 Apr 2022

  14. Moor, J.H.: The nature, importance, and difficulty of machine ethics. IEEE Intell. Syst. (2006). https://doi.org/10.1109/MIS.2006.80


  15. Moseley, A.: Just war theory. The Internet Encyclopedia of Philosophy. https://iep.utm.edu/justwar/. Accessed 5 Apr 2021

  16. Sparrow, R.: Robots and respect: assessing the case against autonomous weapon systems. Ethics Int. Aff. (2016). https://doi.org/10.1017/S0892679415000647


  17. Worsnip, P.: Wars less deadly than they used to be, report says. Reuters. https://www.reuters.com/article/us-war-casualties-report-idUSTRE60J5UG20100121 (2010). Accessed 12 Apr 2022


Acknowledgements

I would like to thank Dr Shannon Vallor for introducing me to this topic and providing invaluable discussions. I would also like to thank Marijus Dingilevskis, Astrid Bertrand, and the anonymous reviewers for their insightful comments and suggestions.

Funding

No funds, grants, or other support were received.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Joshua L. M. Brand.

Ethics declarations

Conflict of interest

The author has no competing interests to declare that are relevant to the content of this article.



About this article


Cite this article

Brand, J.L.M. Why reciprocity prohibits autonomous weapons systems in war. AI Ethics 3, 619–624 (2023). https://doi.org/10.1007/s43681-022-00193-1

