Abstract
This chapter explores the legal implications of autonomous weapon systems and the potential challenges such systems might present to the laws governing weaponry and the conduct of hostilities. Autonomous weapon systems are weapons that are capable of selecting and engaging a target without further human operator involvement. Although such systems have not yet been fully developed, technological advances, particularly in artificial intelligence, make the appearance of such systems a distinct possibility in the years to come. Given such a possibility, it is essential to look closely at both the relevant technology involved in these cutting-edge systems and the applicable law. This chapter commences with an examination of the emerging technology supporting these sophisticated systems, by detailing autonomous features that are currently being designed for weapons and anticipating how technological advances might be incorporated into future weapon systems. A second aim of the chapter is to describe the relevant law of armed conflict principles applicable to new weapon systems, with a particular focus on the unique legal challenges posed by autonomous weapons. The legal analysis will outline how autonomous weapon systems would need to be designed for them to be deemed lawful per se, and whether the use of autonomous weapons during hostilities might be prohibited in particular circumstances under the law of armed conflict. The third and final focus of this chapter is to address potential lacunae in the law dealing with autonomous weapon systems. In particular, the author will reveal how interpretations of and issues related to subjectivity in targeting decisions and overall accountability may need to be viewed differently in response to autonomy.
The author is a Lieutenant Colonel in the United States Army, Judge Advocate, Faculty, International Law Department, Naval War College, Newport, Rhode Island, USA. The views expressed are those of the author and should not be understood as necessarily representing those of the United States Department of Defense or any other government entity.
Notes
- 1.
US Department of Defense 2012a, p. 13.
- 2.
Singer 2009, p. 128.
- 3.
See, for example, Human Rights Watch 2012, p. 1.
- 4.
Heyns 2013.
- 5.
The US promulgated a policy directive in late 2012 establishing a strict approval process for any AWS acquisitions or development and mandating that various safety measures be incorporated into future AWS designs: US Department of Defense 2012a.
- 6.
For example, the former chief scientist for the US Air Force even contends that technology currently exists to facilitate ‘fully autonomous military strikes’: Dahm 2012, p. 11.
- 7.
Poitras 2012.
- 8.
For an overview of machine learning capabilities and possibilities, see Russell and Norvig 2010, Chap. 18.
- 9.
Public Broadcasting Service 2011.
- 10.
IEEE 2012.
- 11.
- 12.
For example, the US military fears that it will overload its intelligence analysts and their ability to review the information being supplied by unmanned assets if changes, to include increasing the autonomy of the systems, are not made: US Department of Defense 2012b, pp. 30–34, 82–83.
- 13.
US Department of Defense 2012b, p. 1 (‘Enable humans to delegate those tasks that are more effectively done by computer… thus freeing humans to focus on more complex decision making’).
- 14.
Sharkey 2012, p. 110 (observing that ‘armed robots are set to change the pace of battle dramatically in the coming decade. It may not be militarily advantageous to keep a human in control of targeting’).
- 15.
For example, the US is seeking to greatly expand its use of autonomy: US Department of Defense 2012b, pp. 1–3.
- 16.
Human-like cognitive abilities are not the equivalent of human abilities. No consensus exists as to whether and when general artificial intelligence might become available. Computer scientist Noel Sharkey doubts that artificial intelligence advances will achieve human-like abilities even in the next 15 years: Sharkey 2011, p. 140.
- 17.
- 18.
Kellenberger 2011, p. 27.
- 19.
- 20.
US Air Force 2009, p. 16 (stating that ‘[a]s autonomy and automation merge, [systems] will be able to swarm… creating a focused, relentless, and scaled attack’). The US Air Force’s Proliferated Autonomous Weapons may represent an early prototype of future swarming systems. See Singer 2009, p. 232; Alston 2011, p. 43.
- 21.
- 22.
- 23.
Protocol Additional to the Geneva Conventions of 12 August 1949 relating to the Protection of Victims of International Armed Conflicts (Protocol I), 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978) (‘Additional Protocol I’).
- 24.
- 25.
A legal review requirement is generally considered customary only with respect to the means of warfare, namely weapons and weapon systems. Additional Protocol I, Article 48 also requires a legal review of methods of warfare. An obligation to review new methods of warfare has not crystallised into customary international law: Schmitt (ed) 2013, commentary accompanying r. 48.
- 26.
Nuclear Weapons, paras 78–79.
- 27.
- 28.
Additional Protocol I, Articles 48, 51–52.
- 29.
For details, see, for example, Schmitt 2012a.
- 30.
- 31.
The rule specifies that an attack is indiscriminate if it is ‘expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated’: Additional Protocol I, Article 51(5)(b).
- 32.
For a discussion of the methodology, see Thurnher and Kelly 2012.
- 33.
Depending on future technological advances, sliding scale-type algorithms or mechanisms may be developed to allow AWS to adjust from those established baselines on their own based upon changes that the systems identify on the battlefield.
- 34.
Henckaerts and Doswald-Beck 2005, r. 15; Cadwalader 2011, pp. 161–162.
- 35.
Additional Protocol I, Article 57(2)(a)(ii).
- 36.
Humanitarian Policy and Conflict Research 2009, p. 38.
- 37.
Additional Protocol I, Article 57(2)(a)(i).
- 38.
- 39.
It is important to note that the objective decisions made by AWS are distinct from the subjective ones required by the law of armed conflict. Any objective criteria are more akin to Rules of Engagement that direct the autonomous weapon’s actions than to legal thresholds. These operational constraints can and would likely be set at a more stringent level than would be allowed by law.
- 40.
It is less likely that systems that operate underwater or in areas where communications jamming is prevalent will be able to have their subjective values adjusted during a mission.
- 41.
- 42.
Human Rights Watch 2012, p. 42.
- 43.
Fenrick 2010, p. 505.
- 44.
See, for example, Geneva Convention Relative to the Protection of Civilian Persons in Time of War, 12 August 1949, 75 UNTS 287 (entered into force 21 October 1950), Article 146; Additional Protocol I, Articles 86–87; Rome Statute of the International Criminal Court, 17 July 1998, 2187 UNTS 90 (entered into force 1 July 2002), Articles 25(3)(b) and 28. The law of armed conflict further imposes a duty to investigate possible war crimes: Schmitt 2011, pp. 31–84.
References
Ackerman S (2013) Navy preps to build a robot ship that blows up mines. www.wired.com/dangerroom/2013/01/robot-mine-sweeper/. Accessed 26 Feb 2013
Alston P (2011) Lethal robotic technologies: the implications for human rights and international humanitarian law. J Law Inf Sci 21:35–60
Cadwalader G (2011) The rules governing the conduct of hostilities in Additional Protocol I to the Geneva Conventions of 1949: a review of relevant United States references. Yearb Int Humanit Law 14:133–171
Coughlin T (2011) The future of robotic weaponry and the law of armed conflict: irreconcilable differences? Univ Coll London Jurisprudence Rev 17:67–99
Dahm W (2012) Killer drones are science fiction. Wall Street J, 15 Feb 2012, A. 11
Fenrick W (2010) The prosecution of international crimes in relation to the conduct of military operations. In: Gill T, Fleck D (eds) The handbook of the law of military operations. Oxford University Press, Oxford, pp 501–514
Gillespie T, West R (2010) Requirements for autonomous unmanned air systems set by legal issues. Int C2 J 4(2):1–32
Henckaerts J, Doswald-Beck L (2005) Customary international humanitarian law. Cambridge University Press, Cambridge
Heintschel von Heinegg W (2011) Concluding remarks. In: Heintschel von Heinegg W, Beruto GL (eds) International humanitarian law and new weapon technologies. International Institute of Humanitarian Law, Sanremo, pp 183–186
Herbach J (2012) Into the caves of steel: precaution, cognition and robotic weapon systems under the law of armed conflict. Amsterdam Law Forum 4(3):3–20
Heyns C (2013) Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions on lethal autonomous robotics. UN Doc A/HRC/23/47
Human Rights Watch (2012) Losing humanity: the case against killer robots. www.hrw.org/sites/default/files/reports/arms1112ForUpload_0_0.pdf. Accessed 27 Feb 2013
Humanitarian Policy and Conflict Research (2009) Manual on international law applicable to air and missile warfare. www.ihlresearch.org/amw/manual. Accessed 28 June 2013
IEEE (2012) Look ma, no hands. www.ieee.org/about/news/2012/5september_2_2012.html. Accessed 26 Feb 2013
Kellenberger J (2011) Keynote address. In: Heintschel von Heinegg W, Beruto GL (eds) International humanitarian law and new weapon technologies. International Institute of Humanitarian Law, Sanremo, pp 23–27
Poitras C (2012) Smart robotic drones advance science. http://today.uconn.edu/blog/2012/10/smart-robotic-drones-advance-science/. Accessed 27 Feb 2013
Public Broadcasting Service (2011) Smartest machines on earth. (transcript) http://www.pbs.org/wgbh/nova/tech/smartest-machine-on-earth.html. Accessed 27 Feb 2013
Russell S, Norvig P (2010) Artificial intelligence: a modern approach, 3rd edn. Prentice Hall, Upper Saddle River
Schmitt MN (2011) Investigating violations of international law in armed conflict. Harv Natl Secur J 2:31–84
Schmitt MN (2012a) Discriminate warfare: the military necessity-humanity dialectic of international humanitarian law. In: Lovell DW, Primoratz I (eds) Protecting civilians during violent conflict: theoretical and practical issues for the 21st century. Ashgate, Farnham, pp 85–102
Schmitt MN (2012b) Autonomous weapon systems and international humanitarian law: a reply to the critics. Harvard National Security Journal Features. http://harvardnsj.org/wp-content/uploads/2013/02/Schmitt-Autonomous-Weapon-Systems-and-IHL-Final.pdf. Accessed 27 Feb 2013
Schmitt MN (ed) (2013) Tallinn manual on the international law applicable to cyber warfare. International Group of Experts at the Invitation of the NATO Cooperative Cyber Defence Centre of Excellence/Cambridge University Press, Cambridge
Schmitt MN, Thurnher J (2013) ‘Out of the loop’: autonomous weapon systems and the law of armed conflict. Harv Natl Secur J 4:231–281
Sharkey N (2011) Automating warfare: lessons learned from the drones. J Law Inf Sci 21:140–154
Sharkey N (2012) Drones proliferation and protection of civilians. In: Heintschel von Heinegg W, Beruto GL (eds) International humanitarian law and new weapon technologies. International Institute of Humanitarian Law, Sanremo, pp 108–118
Singer PW (2009) Wired for war: the robotics revolution and conflict in the twenty-first century. Penguin Press, New York
Thurnher J, Kelly T (2012) Collateral damage estimation. US Naval War College video. www.youtube.com/watch?v=AvdXJV-N56A&list=PLam-yp5uUR1YEwLbqC0IPrP4EhWOeTf8v&index=1&feature=plpp_video. Accessed 26 Feb 2013
US Air Force (2009) Unmanned aircraft systems flight plan 2009–2047. Headquarters Department of the Air Force, Washington DC
US Defense Advanced Research Projects Agency (2013) DARPA’s anti submarine warfare game goes live. www.darpa.mil/NewsEvents/Releases/2011/2011/04/04_DARPA’s_Anti-Submarine_Warfare_game_goes_live.aspx. Accessed 26 Feb 2013
US Department of Defense (2009) FY2009–2034 unmanned systems integrated roadmap. Government Printing Office, Washington DC
US Department of Defense (2012a) Directive 3000.09: autonomy in weapon systems. Government Printing Office, Washington DC
US Department of Defense (2012b) Task force report: the role of autonomy in DoD systems. www.fas.org/irp/agency/dod/dsb/autonomy.pdf. Accessed 26 Feb 2013
Wagner M (2012) Autonomy in the battlespace: independently operating weapon systems and the law of armed conflict. In: Saxon D (ed) International humanitarian law and the changing technology of war. Martinus Nijhoff, Leiden, pp 99–122
© 2014 T.M.C. Asser Press and the authors
Thurnher, J.S. (2014). Examining Autonomous Weapon Systems from a Law of Armed Conflict Perspective. In: Nasu, H., McLaughlin, R. (eds) New Technologies and the Law of Armed Conflict. T.M.C. Asser Press, The Hague. https://doi.org/10.1007/978-90-6704-933-7_13
DOI: https://doi.org/10.1007/978-90-6704-933-7_13
Publisher Name: T.M.C. Asser Press, The Hague
Print ISBN: 978-90-6704-932-0
Online ISBN: 978-90-6704-933-7