Means and Methods of the Future: Autonomous Systems

Abstract

Autonomous systems will fundamentally alter the way wars are waged. In particular, autonomous weapon systems, capable of selecting and engaging targets without direct human operator involvement, represent a significant shift of humans away from the battlefield. As these new means and methods of warfare are introduced, many important targeting decisions will likely need to be made earlier and further away from the front lines. Prompted by fears of these changes, coupled with other legal and moral concerns, groups opposed to autonomous weapons have formed and begun campaigning for a pre-emptive ban on their development and use. Nations intending to use these emerging technologies must grapple with how best to adjust their targeting processes and procedures to accommodate greater autonomy in weapon systems. This chapter examines these cutting-edge and controversial weapons with a particular emphasis on the legal impact on targeting during international armed conflicts. Initially, the chapter explores the promising technological advances and operational benefits that indicate these weapon systems may become a reality in the not-so-distant future. The focus then turns to the unique challenges the systems present to the law of armed conflict under both weapons law and targeting law principles. Next, the examination shifts to two key aspects of targeting most affected by autonomous systems: targeting doubt and subjectivity in targeting. The author ultimately concludes that autonomous weapon systems are unlikely to be deemed unlawful per se and that, while these targeting issues raise legitimate concerns, the use of autonomous weapons under many circumstances will be lawful.

The views expressed are those of the author and should not be understood as necessarily representing those of NATO, the United States Department of Defense, or any other government entity.


Notes

  1.

    In fact, nations such as the United States and the United Kingdom have declared they are not pursuing such weapons other than human-supervised ones. House of Lords Debate 26 March 2013 (The UK Ministry of Defence ‘currently has no intention of developing [weapon] systems that operate without human intervention’.); United States Department of Defense 2012a, p. 3 (The United States has no ‘plans to develop lethal autonomous weapon systems other than human-supervised systems for the purposes of local defense of manned vehicles or installations’.).

  2.

    More than 40 non-governmental organizations have formed the Campaign to Stop Killer Robots, an umbrella organization dedicated to seeking a comprehensive and pre-emptive ban on the development, production, and use of autonomous weapons. Campaign to Stop Killer Robots 2013. http://www.stopkillerrobots.org/. Accessed 8 January 2014.

  3.

    Human Rights Watch is one of the founding organizations of the coalition. For a full description of their reservations and criticism of autonomous weapon systems, see Human Rights Watch 2012, p. 1.

  4.

    United Nations A/HRC/23/47, p. 21.

  5.

    Convention on Conventional Weapons CCW/MSP/2013/CRP.1, p. 4.

  6.

    Not all of the legal principles discussed below may apply during conflicts not of an international character, otherwise known as non-international armed conflicts. The use of fully autonomous weapons during non-international armed conflicts is outside the scope of this chapter.

  7.

    Krishnan 2009, p. 45.

  8.

    United States Department of Defense 2012, pp. 13–14.

  9.

    United States Department of Defense 2012, pp. 13–14.

  10.

    Human Rights Watch 2012, p. 2.

  11.

    Krishnan 2009, p. 43.

  12.

    Krishnan 2009, p. 44.

  13.

    It is conceivable that advances in artificial intelligence technology in the future may allow systems to possess human-like reasoning. However, it is far from certain that the technology will successfully develop in such a manner, and even Dr. Krishnan contends that any such advances would be unlikely to materialize until well beyond the year 2030. Krishnan 2009, p. 44.

  14.

    Schmitt 2013a, p. 4.

  15.

    Singer 2009, p. 128.

  16.

    For example, the former chief scientist for the United States Air Force postulates that technology currently exists to facilitate ‘fully autonomous military strikes’; Dahm 2012, p. 11.

  17.

    Guarino 2013.

  18.

    Poitras 2012.

  19.

    Guarino 2013. For a more general overview of machine learning capabilities and possibilities, see Russell and Norvig 2010, Chap. 18. For a discussion of how computer systems learn, in ways similar to how humans learn from examples, see Public Broadcasting Service 2011.

  20.

    Heintschel von Heinegg 2011, p. 184 (asserting that such mines are ‘quite common and legally uncontested’).

  21.

    United States Defense Advanced Research Projects Agency 2013. Note, however, that at least initially the vessel is designed to require human approval before launching an attack. The United States Navy is developing similar underwater systems to conduct de-mining operations; Ackerman 2013.

  22.

    Guarino 2013.

  23.

    Ibid.

  24.

    Healey 2013.

  25.

    Guarino 2013.

  26.

    United States Air Force 2009, p. 16 (stating that ‘[a]s autonomy and automation merge, [systems] will be able to swarm … creating a focused, relentless, and scaled attack’). The United States Air Force’s Proliferated Autonomous Weapons may represent an early prototype of future swarming systems. See Singer 2009, p. 232; Alston 2011, p. 43.

  27.

    Singer 2009, p. 74; Kellenberger 2011, p. 27. Note that no consensus exists as to whether and when general artificial intelligence might become available. Artificial intelligence has previously failed to live up to some expectations. Computer scientist Noel Sharkey doubts that artificial intelligence advances will achieve human-like abilities in even the next 15 years; Sharkey 2011, p. 140.

  28.

    Anderson and Waxman 2013, p. 2.

  29.

    United States Department of Defense 2013, p. 25. Under a heading labelled ‘A Look to the Future’ it explains: ‘Currently personnel costs are the greatest single cost in (the Department of Defense), and unmanned systems must strive to reduce the number of personnel required to operate and maintain the systems. Great strides in autonomy, teaming, multi-platform control, tipping, and cueing have reduced the number of personnel required, but much more work needs to occur’.

  30.

    ‘Enable humans to delegate those tasks that are more effectively done by computer … thus freeing humans to focus on more complex decision making’; United States Department of Defense 2012b, p. 1.

  31.

    Sharkey 2012, p. 110.

  32.

    Singer 2009, p. 128.

  33.

    For example, the United States has expressed an interest in expanding the autonomous features, albeit not the lethal targeting capabilities, of its systems in the future; United States Department of Defense 2012b, pp. 1–3; United States Department of Defense 2013, p. 25.

  34.

    Legality of the Threat or Use of Nuclear Weapons (Advisory Opinion), ICJ Reports 1996, p. 226 (hereinafter Nuclear Weapons); Schmitt and Thurnher 2013, p. 243; Schmitt 2013a, p. 8.

  35.

    Henckaerts and Doswald-Beck 2005, r. 70; Nuclear Weapons, supra note 34, para 78; Cadwalader 2011, p. 157.

  36.

    Protocol Additional to the Geneva Conventions of 12 August 1949 relating to the Protection of Victims of International Armed Conflicts (Protocol I), 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978) (hereinafter Additional Protocol I).

  37.

    Henckaerts and Doswald-Beck 2005, r. 71. See also Cadwalader 2011, p. 153.

  38.

    Henckaerts and Doswald-Beck 2005, r. 71.

  39.

    Some commentators even contend that autonomous systems require a new, more ‘holistic’ approach to weapons review procedures. See Liu 2012, p. 639.

  40.

    See for example Schmitt 2013b, commentary accompanying r. 48.

  41.

    Schmitt 2013b, commentary accompanying r. 48.

  42.

    The International Court of Justice has recognized distinction as a ‘cardinal’ principle of the law of armed conflict. Nuclear Weapons, supra note 34, paras 78–79.

  43.

    Additional Protocol I, Articles 49, 51–52.

  44.

    Henckaerts and Doswald-Beck 2005, r. 1; Nuclear Weapons, supra note 34, paras 78–79; Cadwalader 2011, p. 157.

  45.

    See for example, Human Rights Watch 2012, pp. 30–32.

  46.

    Additional Protocol I, Articles 51(5)(b), 57(2)(a)(iii).

  47.

    Henckaerts and Doswald-Beck 2005, r. 14; Cadwalader 2011, pp. 157–158.

  48.

    Additional Protocol I, Article 51(5)(b).

  49.

    For a discussion of the collateral damage methodology used by the United States military, see Thurnher and Kelly 2012.

  50.

    For example, Human Rights Watch maintains that an autonomous weapon ‘could not be programmed to duplicate the psychological processes in human judgment that are necessary to assess proportionality.’ Human Rights Watch 2012, p. 33.

  51.

    Henckaerts and Doswald-Beck 2005, r. 15; Cadwalader 2011, pp. 161–162.

  52.

    Additional Protocol I, Article 57.

  53.

    Additional Protocol I, Article 57(2)(c).

  54.

    Harvard Program on Humanitarian Policy and Conflict Research 2010, p. 38.

  55.

    Additional Protocol I, Article 57(2)(a)(ii).

  56.

    Additional Protocol I, Article 50(1); Henckaerts and Doswald-Beck 2005, r. 6. With regard to doubt involving the status of objects, Article 52(3) of Additional Protocol I requires Parties to presume an object is of civilian character in cases of doubt. Although it is unclear whether the rule is customary, States will nevertheless likely develop their autonomous systems to comply with such a rule. Henckaerts and Doswald-Beck 2005, r. 10.

  57.

    Schmitt 2013b, commentary accompanying r. 33. See also Henckaerts and Doswald-Beck 2005, r. 10, which describes an Israeli position that in situations of doubt as to the character of an object, the appropriate threshold is whether ‘significant’ doubt exists.

  58.

    See generally, CAVV 2013, p. 19.

  59.

    Arkin 2009, p. 46.

  60.

    Herbach 2012, pp. 17–19; Wagner 2012, pp. 121–122. See generally Gillespie and West 2010, pp. 13–20; O’Connell 2013, p. 7.

  61.

    O’Connell 2013, p. 12 (‘[T]he ultimate decision to kill must be made, therefore, by a human being at or very near the time of the lethal impact.’).

  62.

    International Committee of the Red Cross 2013.

  63.

    The United States issued a policy directive in 2012 establishing a strict approval process for any autonomous weapon system (AWS) acquisitions or development and mandating that various safety measures be incorporated into future AWS designs. United States Department of Defense 2012.

References

  • Ackerman S (2013) Navy preps to build a robot ship that blows up mines. www.wired.com/dangerroom/2013/01/robot-mine-sweeper/. Accessed 30 Dec 2013

  • Alston P (2011) Lethal robotic technologies: the implications for human rights and international humanitarian law. J Law Inf Sci 21:35–60


  • Anderson K, Waxman M (2013) Law and ethics for robot soldiers: why a ban won’t work and how the laws of war can. Hoover Inst Policy Rev


  • Arkin R (2009) Governing lethal behavior in autonomous robots. Chapman & Hall, Boca Raton


  • Bar-Cohen Y, Hanson D (2009) The coming robot revolution: expectations and fears about emerging intelligent humanlike machines. Springer Science & Business Media, Pasadena


  • Barnes M, Jentsch F (eds) (2010) Human-robot interactions in future military operations. Ashgate Publishing Company, Burlington


  • Boothby W (2012) The law of targeting. Oxford University Press, Oxford


  • Cadwalader G (2011) The rules governing the conduct of hostilities in Additional Protocol I to the Geneva Conventions of 1949: a review of relevant United States references. Yearb Int Humanit Law 14:133–171


  • Campaign to Stop Killer Robots (2013) Who we are. http://www.stopkillerrobots.org/coalition. Accessed 30 Dec 2013

  • CAVV (Commissie Van Advies Inzake Volkenrechtelijke Vraagstukken) (2013) Advisory report on armed drones, Advisory Committee on Issues of Public International Law. Advisory Report 23, July 2013


  • Convention on Conventional Weapons (2013) Final report of the meeting of the high contracting parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons which may be deemed to be excessively injurious or to have indiscriminate effects, CCW/MSP/2013/CRP.1


  • Coughlin T (2011) The future of robotic weaponry and the law of armed conflict: irreconcilable differences? Univ Coll Lond Jurisprudence Rev 17:67–99


  • Dahm W (2012) Killer drones are science fiction. Wall Str J, 15 January 2012


  • Fenrick W (2010) The prosecution of international crimes in relation to the conduct of military operations. In: Gill T, Fleck D (eds) The handbook of the law of military operations. Oxford University Press, Oxford, pp 501–514


  • Gillespie T, West R (2010) Requirements for autonomous unmanned air systems set by legal issues. Int C2 J 4(2):1–32


  • Gogarty B, Hagger M (2008) The laws of man over vehicles unmanned: the legal response to robotic revolution on sea, land and air. J Law, Inf Sci 19:73–145


  • Graham D (2011) The law of armed conflict in asymmetric urban armed conflict. In: Pedrozo RA, Wollschlaeger DP (eds) International law and the changing character of war. International Law Studies, vol 87. US Naval War College, Newport, pp 301–313


  • Guarino A (2013) Autonomous cyber weapons no longer science-fiction. Engineering and Technology Magazine. http://eandt.theiet.org/magazine/2013/08/intelligent-weapons-are-coming.cfm. Accessed 27 Dec 2013

  • Harvard Program on Humanitarian Policy and Conflict Research (HPCR) (2009) Manual on international law applicable to air and missile warfare


  • Healey J (2013) Stuxnet and the dawn of algorithmic warfare. Huffington Post. http://www.huffingtonpost.com/jason-healey/stuxnet-cyberwarfare_b_3091274.html. Accessed 27 Dec 2013

  • Heintschel von Heinegg W (2011) Concluding remarks. In: Heintschel von Heinegg W, Beruto GL (eds) International humanitarian law and new weapon technologies. International Institute of Humanitarian Law, Sanremo, pp 183–186


  • Henckaerts J, Doswald-Beck L (eds) (2005) Customary international humanitarian law, ICRC. Cambridge University Press, Cambridge


  • Herbach J (2012) Into the caves of steel: precaution, cognition and robotic weapon systems under the law of armed conflict. Amsterdam Law Forum 4(3):3–20


  • House of Lords Debate 26 March 2013 (Lord Astor of Hever, Parliamentary Under Secretary of State, Defence). http://www.publications.parliament.uk/pa/ld201213/ldhansrd/text/130326-0001.htm#st_14. Accessed 30 Dec 2013

  • Human Rights Watch (2012) Losing humanity: the case against killer robots. www.hrw.org/sites/default/files/reports/arms1112ForUpload_0_0.pdf. Accessed 30 Dec 2013

  • IEEE (2012) Look ma, no hands. www.ieee.org/about/news/2012/5september_2_2012.html. Accessed 30 Dec 2013

  • International Committee of the Red Cross (2013) Autonomous weapons: States must address major humanitarian, ethical challenges. http://www.icrc.org/eng/resources/documents/faq/q-and-a-autonomous-weapons.htm. Accessed 30 Dec 2013

  • Jenks C (2009) Law from above: unmanned aerial systems, use of force, and the law of armed conflict. North Dakota Law Rev 85:650–671


  • Jensen E (2013) Future war, future law. Minn J Int Law 22:282–23


  • Kellenberger J (2011) Keynote address. In: Heintschel von Heinegg W, Beruto GL (eds) International humanitarian law and new weapon technologies. International Institute of Humanitarian Law, Sanremo, pp 23–27


  • Krishnan A (2009) Killer robots: legality and ethicality of autonomous weapons. Ashgate, Burlington


  • Liu H (2012) Categorization and legality of autonomous and remote weapons systems. Int Rev Red Cross 94(886):627–652


  • Melzer N (2013) Human rights implications of the usage of drones and unmanned robots in warfare. European Parliament, EXPO/B/DROI/2012/12


  • O’Connell M (2013) Banning autonomous killing. Notre Dame Legal Studies Paper No. 1445. http://www.ssrn.com/link/notre-dame-legal-studies.html. Accessed 27 Dec 2013

  • Pedrozo R (2011) Use of unmanned systems to combat terrorism. In: Pedrozo RA, Wollschlaeger DP (eds) International law and the changing character of war. International Law Studies, vol 87. US Naval War College, Newport, pp 217–270


  • Poitras C (2012) Smart robotic drones advance science. http://today.uconn.edu/blog/2012/10/smart-robotic-drones-advance-science/. Accessed 27 Feb 2013

  • Public Broadcasting Service (2011) Smartest machines on earth. (transcript) www.pbs.org/wgbh/nova/tech/smartest-machine-on-earth.html. Accessed 30 Dec 2013

  • Reeves S, Thurnher J (2013) Are we reaching a tipping point? How contemporary challenges are affecting the military necessity-humanity balance. Harvard Natl Secur J Featur 1–12. http://harvardnsj.org/wp-content/uploads/2013/06/HNSJ-Necessity-Humanity-Balance_PDF-format1.pdf. Accessed 27 Dec 2013

  • Russell S, Norvig P (2010) Artificial intelligence: a modern approach, 3rd edn. Prentice Hall, Upper Saddle River


  • Schmitt MN (2011) Investigating violations of international law in armed conflict. Harv Natl Secur J 2:31–84


  • Schmitt MN (2012) Discriminate warfare: the military necessity-humanity dialectic of international humanitarian law. In: Lovell DW, Primoratz I (eds) Protecting civilians during violent conflict: theoretical and practical issues for the 21st century. Ashgate, Farnham, pp 85–102


  • Schmitt MN, Thurnher J (2013) ‘Out of the loop’: autonomous weapon systems and the law of armed conflict. Harv Natl Secur J 4:231–281


  • Schmitt MN (2013a) Autonomous weapon systems and international humanitarian law: a reply to the critics. Harv Natl Secur J Featur. http://harvardnsj.org/wp-content/uploads/2013/02/Schmitt-Autonomous-Weapon-Systems-and-IHL-Final.pdf. Accessed 30 Dec 2013

  • Schmitt MN (ed) (2013b) Tallinn manual on the international law applicable to cyber warfare. International Group of Experts at the invitation of the NATO Cooperative Cyber Defence Centre of Excellence. Cambridge University Press, Cambridge


  • Sharkey N (2011) Automating warfare: lessons learned from the drones. J Law, Inf Sci 21:140–154


  • Sharkey N (2012) Drones proliferation and protection of civilians. In: Heintschel von Heinegg W, Beruto GL (eds) International humanitarian law and new weapon technologies. International Institute of Humanitarian Law, Sanremo, pp 108–118


  • Singer P (2009) Wired for war: The robotics revolution and conflict in the twenty-first century. Penguin Press, New York


  • Stewart D (2011) New technology and the law of armed conflict: technological meteorites and legal dinosaurs? In: Pedrozo RA, Wollschlaeger DP (eds) International law and the changing character of war. International law studies, vol 87. US Naval War College, Newport, pp 271–300


  • Thurnher J, Kelly T (2012) Collateral damage estimation. US Naval War College video. www.youtube.com/watch?v=AvdXJV-N56A&list=PLam-yp5uUR1YEwLbqC0IPrP4EhWOeTf8v&index=1&feature=plpp_video. Accessed 30 Dec 2013

  • United Nations (2010) Interim report of the special rapporteur on extrajudicial, summary or arbitrary executions, UN Doc A/65/321


  • United Nations (2013) Report of the special rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns, UN Doc A/HRC/23/47


  • United States Air Force (2009) Unmanned aircraft systems flight plan 2009–2047. Headquarters Department of the Air Force, Washington, DC


  • United States Army (1956) The law of land warfare, Field Manual (FM) 27-10. Headquarters Department of the Army, Washington, DC


  • United States Defense Advanced Research Projects Agency (2013) DARPA’s anti-submarine warfare game goes live. www.darpa.mil/NewsEvents/Releases/2011/2011/04/04_DARPA’s_Anti-Submarine_Warfare_game_goes_live.aspx. Accessed 30 Dec 2013

  • United States Department of Defense (2009) FY2009–2034 unmanned systems integrated roadmap. Government Printing Office, Washington, DC


  • United States Department of Defense (2012) Directive 3000.09: autonomy in weapon systems. Government Printing Office, Washington, DC


  • United States Department of Defense (2012a) Directive 3000.09: autonomy in weapon systems: response-to-query talking points. Government Printing Office, Washington, DC (on file with author)


  • United States Department of Defense (2012b) Task force report: the role of autonomy in DoD systems. www.fas.org/irp/agency/dod/dsb/autonomy.pdf. Accessed 30 Dec 2013

  • United States Department of Defense (2013) FY2013-2038 unmanned systems integrated roadmap. Government Printing Office, Washington, DC


  • United States Joint Forces Command (2003) Rapid Assessment Process (RAP) Report No. 03-10, Unmanned Effects (UFX): taking the human out of the loop. Headquarters Joint Forces Command, Suffolk


  • United States Navy, Marine Corps & Coast Guard (2007) The commander’s handbook on the law of naval operations. Naval Warfare Publication (NWP) 1-14 M/Marine Corps Warfighting Publication (MCWP) 5-12.1/Commandant Publication (COMDTPUB) P5800.7A, Department of the Navy, Washington, DC


  • Van Tol J et al (2012) Air sea battle: a point-of-departure operational concept. www.csbaonline.org/wp-content/uploads/2010/05/2010.05.18-AirSea-Battle.pdf. Accessed 30 Dec 2013

  • Vogel R (2010) Drone warfare and the law of armed conflict. Denver J Int Law Policy 39:101–138


  • Wagner M (2011) Taking humans out of the loop: implications for international humanitarian law. J Law, Inf Sci 21:1–11


  • Wagner M (2012) Autonomy in the battlespace: independently operating weapon systems and the law of armed conflict. In: Saxon D (ed) International humanitarian law and the changing technology of war. Martinus Nijhoff, Leiden, pp 99–122



Author information


Correspondence to Jeffrey S. Thurnher.



Copyright information

© 2016 T.M.C. Asser Press and the authors

About this chapter

Cite this chapter

Thurnher, J.S. (2016). Means and Methods of the Future: Autonomous Systems. In: Ducheine, P., Schmitt, M., Osinga, F. (eds) Targeting: The Challenges of Modern Warfare. T.M.C. Asser Press, The Hague. https://doi.org/10.1007/978-94-6265-072-5_9


  • DOI: https://doi.org/10.1007/978-94-6265-072-5_9


  • Publisher Name: T.M.C. Asser Press, The Hague

  • Print ISBN: 978-94-6265-071-8

  • Online ISBN: 978-94-6265-072-5

