
Military Robots and the Principle of Humanity: Distorting the Human Face of the Law?

  • Chapter in: Armed Conflict and International Law: In Search of the Human Face

Abstract

This article aims to raise awareness of the potential challenges involved in sending (autonomous) robots to war. Drawing on multiple disciplines, the author finds that the advantages and disadvantages of using robotic soldiers may well allow one to argue either way. However, taking into consideration the principle of humanity as a cornerstone of international humanitarian law, particularly strong concerns arise. Since robots lack analytical skills and are unable to conceive of ethical and moral concerns, it is held that they cannot act in accordance with the rules applicable during armed conflict. An urgent need is recognised for the international (legal) community to take ownership of the process of regulating the deployment of robots in war situations.

The author is a PhD candidate at the University of Amsterdam.


Notes

  1. 1.

    See also McCormack and Radin 2009, pp. IX–XII.

  2. 2.

    Lovgren 2006.

  3. 3.

    iRobot website 2011; Sockification Youtube 2011.

  4. 4.

    The Telegraph 2010.

  5. 5.

    National Defense Authorization Act for Fiscal Year 2001, US Public Law 106-398, Section 220, 106th US Congress, 2nd session, 2000 http://www.dod.mil/dodgc/olc/docs/2001NDAA.pdf; Economist 2007; Sparrow 2007, p. 64.

  6. 6.

    Unmanned Aircraft Systems Roadmap 2005–2030, Office of the US Secretary of Defense, 2005; Joint Robotics Program Master Plan FY2005, LSD (AT&L) Defense Systems/Land Warfare and Munitions, 3090 Pentagon, Washington DC 20301-3090; The Navy Unmanned Undersea Vehicle (UUV) Master Plan, Department of the Navy, USA, 9 November 2004; Unmanned Systems Roadmap 2007–2032, US Department of Defense, 10 December 2007; All cited in Sharkey 2008a, p. 86.

  7. 7.

    CBC News 2011.

  8. 8.

    Ibid.

  9. 9.

    Statute of the International Court of Justice, San Francisco, 26 June 1945, Trb. 1971, No. 55, Article 38.1.(c); ‘The Statute of the International Court of Justice’ is available at http://www.icj-cij.org/documents/index.php?p1=4&p2=2&p3=0.

  10. 10.

    Capek 1920.

  11. 11.

    Sparrow 2007, p. 65.

  12. 12.

    Sparrow cites an Air Armament Center Public Affairs report of 2000, ‘This bomb can think before it acts’, as published in Leading Edge magazine 42(2):12; See also Kurzweil 1990, 2000; Greenwald 2011.

  13. 13.

    Kurzweil, Ibid.

  14. 14.

    Kurzweil 2009; This article was originally published in the 2008 Scientific American ‘Special Report on Robots’.

  15. 15.

    IBM website 2011.

  16. 16.

    Sparrow 2007, p. 65; For reasons of simplicity, autonomous robots are here presumed to be robots programmed by humans for one or more specific tasks, but without the involvement of a human operator in important decision-making structures.

  17. 17.

    The US Air Force has four levels and the US Navy distinguishes between scripted, supervised and intelligent robots; Sharkey 2008b, p. 16.

  18. 18.

    The US Army has ten levels of autonomy, the US Air Force has four levels and the US Navy distinguishes between scripted, supervised and intelligent robots; Sharkey 2008b, p. 16.

  19. 19.

    Robot prices were 80 % lower in 2006 than in 1990; Sharkey 2008c, p. 1800.

  20. 20.

    Isaacson 2011.

  21. 21.

    Kurshid et al. 2004, p. 775.

  22. 22.

    Sharkey 2008b, p. 16.

  23. 23.

    Sparrow 2007, p. 69.

  24. 24.

    Singer 2010.

  25. 25.

    Kurshid et al. 2004, p. 775.

  26. 26.

    No Hands Across America 2011 http://www.cs.cmu.edu/afs/cs/user/tjochem/www/nhaa/general_info.html; See also Singer 2010, p. 90.

  27. 27.

    Singer 2010, p. 140.

  28. 28.

    Singer 2009, p. 33; Sharkey 2008b, p. 15.

  29. 29.

    Tesla developed remote-controlled torpedoes in the late nineteenth century (Sharkey 2008a, p. 86), and during World War II Nazi Germany used the Fieseler Fi-103, also known as Vergeltungswaffe-1 or the V1 flying bomb, during the attacks on London in 1944. The V1 could be pre-programmed to fly a relatively short distance before dropping to the ground and exploding; it is similar in function to a cruise missile, but has little in common with the modern-day UAVs and robotic aerial vehicles of concern in this article.

  30. 30.

    Global Security website 2011; Fairly little public knowledge existed about the drones used in the Vietnam War, which experienced some major difficulties: 16 % of the RPVs crashed; Singer 2009, p. 29.

  31. 31.

    Singer 2009, p. 30.

  32. 32.

    Despite being linked to human operators on the ground, who decide when to send out the robot, the Global Hawk carries out its mission autonomously; Singer 2009, p. 40.

  33. 33.

    Ibid.

  34. 34.

    Johansen 2011.

  35. 35.

    For an overview of the history of UAVs in warfare see Cook 2007.

  36. 36.

    From ‘biology’ and ‘mimetic’ (to mimic or copy); Singer 2010, p. 91.

  37. 37.

    Singer 2010, pp. 89, 90.

  38. 38.

    Singer 2009, p. 40.

  39. 39.

    Ibid.

  40. 40.

    Singer, p. 39.

  41. 41.

    Singer, p. 41.

  42. 42.

    Santos et al. 2008.

  43. 43.

    Thinkbotics website 2011.

  44. 44.

    Boston Dynamics website 2011.

  45. 45.

    Singer 2010, p. 115.

  46. 46.

    Sparrow 2007, p. 63.

  47. 47.

    Kurshid et al. 2004, p. 775.

  48. 48.

    The project is carried out at the Institute for Cognitive Systems (ICS) at the TU Munich; Innovations-report 2011.

  49. 49.

    Singer 2010, p. 113.

  50. 50.

    Singer 2010, p. 112.

  51. 51.

    These kinds of robots are being developed for instance by Applied Perception Inc., see Voth 2004, p. 2.

  52. 52.

    Schmitt 1999, p. 143.

  53. 53.

    Ishiguro 2005 cited in Wallach and Allan 2009, 162 ff., 247.

  54. 54.

    The Predator is equipped with Hellfire antitank missiles; Johansen 2011.

  55. 55.

    Singer 2009, p. 33.

  56. 56.

    Johansen 2011.

  57. 57.

    Singer 2009, p. 33.

  58. 58.

    Johansen 2011.

  59. 59.

    As early as 2009, the UN Special Rapporteur Philip Alston questioned the legality of the US use of drones to kill militants in Afghanistan and Pakistan; cited in Bowcott 2010.

  60. 60.

    Johansen 2011.

  61. 61.

    Some major technological difficulties have been experienced with the SWORDS; Popular Mechanics website 2008.

  62. 62.

    Sharkey 2008b, p. 14.

  63. 63.

    Ibid.

  64. 64.

    Singer 2010, p. 30; Singer 2009, p. 35; Sharkey 2008b, p. 14.

  65. 65.

    Sharkey 2008b.

  66. 66.

    Throughout the article the term ‘international humanitarian law’ or IHL shall be used rather than that of ‘law of armed conflict’. This is on the one hand simply due to the thematic focus of this collection on the human face of the law, and, on the other, to pay tribute to the development of international law over the last decades. Further, the preference for the term IHL is also rooted in the idea that nowadays a declaration of war or the explicit acknowledgement of both parties of a state of armed conflict is no longer a prerequisite for the application of IHL. Rather, the relevant rules apply objectively as a matter of fact, and in addition to times of international armed conflict, in non-international armed conflicts as well as situations of occupation. In a sense, the term ‘IHL’ is thus wider than LOAC, except for the law of neutrality which is not primarily concerned with humanitarian considerations and therefore falls outside the scope of IHL; See Greenwood 2009, p. 11.

  67. 67.

    Keegan 1993.

  68. 68.

    Grotius 1625.

  69. 69.

    Schachter 1991, p. 36; Ago 1957, p. 693.

  70. 70.

    Radbruch et al. 2003.

  71. 71.

    Ibid.

  72. 72.

    Highly disputed, Radbruch’s formula was taken up again during the trials over the Berlin Wall shootings and was recently of interest in ECHR, Kononov v. Latvia, Grand Chamber Judgment, Application no. 36376/04, 17 May 2010 http://hudoc.echr.coe.int/sites/eng/pages/search.aspx?i=001-98669#{"itemid":["001-98669"]}; See also Mertens 2006, pp. 277–295 and Miller 2001, pp. 653–663.

  73. 73.

    Arendt 2006, pp. 135–150.

  74. 74.

    Lin et al. 2008, p. 42.

  75. 75.

    See Article 53 of the Vienna Convention on the Law of Treaties, Vienna, 23 May 1969, United Nations Treaty Series, Volume 1155, p. 331.

  76. 76.

    Ingierd touches upon even broader questions concerning ‘Moral Responsibility in War’ focusing specifically on complex peace operations. She recognises the difficulties involved in practically applying a concept such as morality to conflict situations; Ingierd 2010.

  77. 77.

    The Rome Statute of the International Criminal Court, Rome, 17 July 1998, UN Doc. A/CONF.183/9, Article 5. http://untreaty.un.org/cod/icc/statute/romefra.htm.

  78. 78.

    Ibid., Article 7.

  79. 79.

    Robertson 2000, p. 239.

  80. 80.

    Ruti 2011.

  81. 81.

    See for instance the Report of the Secretary-General to the Security Council 2005, para 12.

  82. 82.

    An in-depth discussion of the complementation and contradiction between these areas of international law is outside the scope of this chapter.

  83. 83.

    Convention (I) for the Amelioration of the Condition of the Wounded and Sick in Armed Forces in the Field (hereinafter GC I), Geneva, 12 August 1949, United Nations Treaty Series, Volume Number 75; Convention (II) for the Amelioration of the Condition of Wounded, Sick and Shipwrecked Members of Armed Forces at Sea (hereinafter GC II), Geneva, 12 August 1949, United Nations Treaty Series, Volume Number 75; Convention (III) relative to the Treatment of Prisoners of War (hereinafter GC III), Geneva, 12 August 1949, United Nations Treaty Series, Volume Number 75; Convention (IV) relative to the Protection of Civilian Persons in Time of War (hereinafter GC IV), Geneva, 12 August 1949, United Nations Treaty Series, Volume Number 75; Collectively referred to as ‘the Geneva Conventions’ or ‘the Conventions’. http://www.icrc.org/ihl.

  84. 84.

    According to the ICRC, 194 States are parties to the Geneva Conventions; ICRC 2011b.

  85. 85.

    ICJ, Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion, General List No. 95, 8 July 1996, I.C.J. Reports 1996, pp. 257, 258, paras 79, 82; ICJ, Corfu Channel Case (United Kingdom of Great Britain and Northern Ireland v. Albania), Judgment No. 1, 9 April 1949, I.C.J. Reports 1949, p. 22.

  86. 86.

    Common Article 1 of the Geneva Conventions of 1949, supra note 83.

  87. 87.

    Azzam 1997, p. 55.

  88. 88.

    Declaration Renouncing the Use, in Time of War, of Explosive Projectiles Under 400 Grammes Weight, Saint Petersburg, 29 November–11 December 1868; Schindler and Toman 1988, p. 102, or http://www.icrc.org/ihl.nsf/INTRO/130?OpenDocument.

  89. 89.

    Martens 1871; Martens 1882, p. 178; Martens 1879, p. 45 (in Russian); All cited in Pustogarov 1996.

  90. 90.

    This contention echoes ideas of the Enlightenment, including those previously held by scholars such as Rousseau and Locke in relation to the ‘social contract’. In addition, these ideas adequately capture the spirit of human rights law more generally, as reflected in the Universal Declaration of Human Rights; The Universal Declaration of Human Rights, United Nations, General Assembly, General Assembly Resolution 217 (III), 10 December 1948, UN GAOR, 3d Sess., Supp. No. 13, UN Doc. A/810 (1948), p. 71.

  91. 91.

    Pustogarov 1996.

  92. 92.

    Ibid.

  93. 93.

    Final Act of the International Peace Conference, The Hague, 29 July 1899. http://www.icrc.org/ihl.nsf/FULL/145?OpenDocument and Schindler and Toman 1988, pp. 50, 51.

  94. 94.

    Convention (IV) respecting the Laws and Customs of War on Land and its annex; Regulations concerning the Laws and Customs of War on Land, The Hague, 18 October 1907; Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects, Geneva, 10 October 1980, United Nations Treaty Series, Volume 1342, p. 137; and GC I, Article 63(4); GC II, Article 62 (4); GC III, Article 142(4), and GC IV, Article 158(4), supra note 83; Protocol (I) Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (hereinafter AP I), Geneva, 8 June 1977, United Nations Treaty Series, Volume Number 1125, Article 1; Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of Non-International Armed Conflicts (hereinafter AP II), Geneva, 8 June 1977, United Nations Treaty Series, Volume Number 1125, Preamble.

  95. 95.

    Pictet 1958, p. 15.

  96. 96.

    Oxford English Dictionary 1989; Coupland 2001, pp. 969–989.

  97. 97.

    Vienna Convention on the Law of Treaties, supra note 75, entered into force 27 January 1980.

  98. 98.

    Coupland 2001, pp. 969–989.

  99. 99.

    Wortel 2009, pp. 779–802.

  100. 100.

    Robertson 2000, p. 239; Meron 2000; Cassese 2000.

  101. 101.

    Riesenberger and Riesenberger 2011.

  102. 102.

    Dunant 1986.

  103. 103.

    ICRC 1990, p. 8.

  104. 104.

    Pictet 1956, p. 14.

  105. 105.

    Pictet 1956, p. 12; The founders of the organisation described their aim as preventing and alleviating human suffering, protecting life and health, ensuring respect for the human being and promoting mutual understanding, friendship, co-operation and lasting peace amongst all peoples; Durand 1981, p. 54.

  106. 106.

    Statute of the ICRC, Article 5. http://www.icrc.org/eng/resources/documents/misc/icrc-statutes-080503.htm.

  107. 107.

    Brownlie 1998, p. 28.

  108. 108.

    Meron 1998, p. 74.

  109. 109.

    ICJ, Corfu Channel Case, supra note 85, p. 22.

  110. 110.

    Written Submission by the Russian Federation as requested by the General Assembly, 13; Written Submission on the Opinion requested by the General Assembly by the United Kingdom, 21; Nauru, Written Submission on the Opinion requested by the World Health Organisation, 46; All cited in Ticehurst 1997.

  111. 111.

    ICJ, Legality of the Threat or Use of Nuclear Weapons, supra note 85, para 78.

  112. 112.

    ICJ, Ibid., Dissenting Opinion of Judge Koroma, http://www.icj-cij.org/docket/files/95/7523.pdf, p. 14.

  113. 113.

    ICJ, Ibid., Dissenting Opinion of Judge Shahabuddeen, http://www.icj-cij.org/docket/files/95/7519.pdf, p. 2.

  114. 114.

    Japan, Oral Statement before the ICJ, public sitting of Tuesday 7 November 1995, p. 18, http://www.icj-cij.org/docket/files/95/5935.pdf, see also Ticehurst 1997.

  115. 115.

    United Nations Report of the International Law Commission on the Work of its Forty-Sixth Session, 2 May–22 July 1994, GAOR A/49/10, p. 317.

  116. 116.

    Cassese 2000, p. 187.

  117. 117.

    Schwarzenberger 1958, pp. 10, 11.

  118. 118.

    Röling 1960, pp. 37, 38.

  119. 119.

    Cassese 2000, p. 189; See also Blinz 1960, pp. 139–160.

  120. 120.

    Ticehurst 1997.

  121. 121.

    Nauru, Written Submission on the Opinion requested by the World Health Organisation, supra note 85.

  122. 122.

    Dissenting Opinion of Judge Shahabuddeen, supra note 85.

  123. 123.

    United Nations, General Assembly, General Assembly Resolution 38/75, 15 December 1983, A/RES/38/75, p. 69.

  124. 124.

    McBride 1984, p. 406.

  125. 125.

    Greenwood 2009, p. 28.

  126. 126.

    The teleological approach was defined by the ICTY in the Čelebići case, Prosecutor v. Delalić et al., Judgment, Trial Chamber II, Case No. IT-96-21-T, 16 November 1998, at para 163, accordingly: “(A)lso called the ‘progressive’ or ‘extensive’ approach of the civilian jurisprudence, (it) is in contrast with the legislative historical approach. The teleological approach plays the same role as the ‘mischief rule’ of common law jurisprudence. This approach enables interpretation of the subject matter of legislation within the context of contemporary conditions. The idea of the approach is to adapt the law to changed conditions, be they special, economic or technological, and attribute such change to the intention of the legislation”. http://www.icty.org/x/cases/mucic/tjug/en/cel-tj981116e.pdf.

  127. 127.

    Ibid., para 189.

  128. 128.

    A discussion on personhood (a hot topic especially in bioethics today) in relation to robots is outside the scope of this chapter.

  129. 129.

    Wallach and Allan 2009, pp. 42–45, 63, 163, 210.

  130. 130.

    Lin et al. 2008 briefly discuss the rather absurd possibility of sending out ‘comfort robots’ with the troops to take on the role of ‘lovers’ or ‘relationship partners’. Lin et al. 2008, pp. 81–83.

  131. 131.

    Garreau 2007.

  132. 132.

    Cited in Singer 2010, photograph comments.

  133. 133.

    Borenstein 2008, p. 5.

  134. 134.

    Minton 1988.

  135. 135.

    Oxford English Dictionary 2011.

  136. 136.

    Gardner 1985.

  137. 137.

    There are at least five other cognitive abilities that complement one another and together form the pieces that make up the intelligence of a person. These include spatial, bodily-kinetic and musical intelligence, as well as interpersonal and intrapersonal abilities. Accordingly, next to a person’s capacity to perform mathematical calculations or recognise forms and patterns, it also matters how well one is able to visualise certain ideas, how developed one’s ability to cope with words and languages is, and how pronounced one’s ability to exercise control over bodily motions is. In addition, musical intelligence relates to the auditory skills of a person and his or her sensitivity to sounds, rhythms and tones. Artistic intelligence in the wider sense may relate to a person’s feeling for the composition of colours and forms.

  138. 138.

    Gardner 2002; Gardner 1995 (emphasis added).

  139. 139.

    Cited in Goleman 2011, p. 64.

  140. 140.

    The two prominent schools researching personal intelligence include behaviourists like B.F. Skinner who restrict their research to describing human behaviour, and researchers focusing on (meta)-cognition like Gardner. Both refrain from analysing emotions themselves; Goleman 2011, p. 64.

  141. 141.

    Hoffman 1984.

  142. 142.

    Goleman 2011 (quoting the findings of P Ekman), p. 22.

  143. 143.

    Meaning both the level of insight into one’s own feelings and knowledge of human nature more generally.

  144. 144.

    Goleman 2011, p. 65 (quoting Salovey).

  145. 145.

    Goleman 1989.

  146. 146.

    Hoffman 1984.

  147. 147.

    Goleman 1989, p. 138.

  148. 148.

    Hoffman 1984.

  149. 149.

    Ibid.

  150. 150.

    Goleman 1989, p. 138.

  151. 151.

    Voth 2004, pp. 4–5.

  152. 152.

    1. A robot may not injure a human being, or through inaction, allow a human being to come to harm. 2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law; Asimov 1942. In 1985 Asimov added a Zeroth Law: 0. A robot may not harm humanity, or by inaction, allow humanity to come to harm; Asimov 1985. By adding the Zeroth Law, he raised the bar significantly and included the crime of non-assistance of a person in danger.
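
    Read as a strict precedence ordering, a lower-numbered law always prevails over a higher-numbered one. A minimal sketch of such an ordering, purely for illustration (the Action type and its harm/obedience/self-preservation flags are hypothetical placeholders, not drawn from Asimov or from any actual system), could look as follows:

```python
from dataclasses import dataclass

@dataclass
class Action:
    # Hypothetical flags standing in for the judgements the laws presuppose.
    harms_human: bool = False      # would injure a human, or fail to prevent injury
    disobeys_order: bool = False   # would violate an order given by a human
    endangers_self: bool = False   # would risk the robot's own existence

def rank(a: Action) -> tuple:
    # Lexicographic comparison: harm to humans outweighs disobedience,
    # which in turn outweighs risk to the robot itself.
    return (a.harms_human, a.disobeys_order, a.endangers_self)

def choose(candidates: list[Action]) -> Action:
    # Pick the candidate that violates the highest-priority law the least.
    return min(candidates, key=rank)

options = [
    Action(harms_human=True),                         # violates the First Law
    Action(disobeys_order=True),                      # violates only the Second
    Action(disobeys_order=True, endangers_self=True)  # violates the Second and Third
]
print(choose(options))  # selects the action that merely disobeys an order
```

    Even in this toy form, everything of substance is hidden in the placeholder flags: deciding whether a concrete act ‘harms a human being’ is precisely the kind of judgement discussed in the main text.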

  153. 153.

    Dilov (1974) ‘The Way of Icarus’, or Clarke (1994) ‘An extended Set of the Laws of Robotics’, cited in Lin et al. 2008, 31 ff.

  154. 154.

    In a more scientific effort to address potential problems, the United Kingdom's Engineering and Physical Sciences Research Council (EPSRC), together with the Arts and Humanities Research Council, designed a ‘semi-legal’ set of rules. See Engineering and Physical Research Council website 2011; In 2007, South Korea also initiated the creation of a ‘Robot Ethics Charter’ in which futurists and science fiction writers were to create an ethical code to prevent humans abusing robots and vice versa. To the knowledge of the author, however, there has been no outcome concerning the international legal regulation of robots; BBC News 2007.

  155. 155.

    The principle of nulla poena, nullum crimen sine lege is based, amongst others, on the principle of non-retroactivity and the principle of certainty. It means “that an act can be punished only if, at the time of its commission, the act was the object of a valid, sufficiently precise, written criminal law to which a sufficiently certain sanction was attached”; See Kreß 2008.

  156. 156.

    The principle that we must be able to identify those responsible for deaths in war is based on moral consequentialism and deontology; Sparrow 2007, pp. 66–68, also citing Nagel 1972 and Walzer 2000.

  157. 157.

    Sparrow finds that in these cases, responsibility would fall on the commanding officer.

  158. 158.

    Sparrow describes a rather surreal scenario in which punishments could be envisaged for robots. He also looks at the possibilities of attributing responsibility to the programmer or the commanding officer; Sparrow 2007, pp. 69–73.

  159. 159.

    Lin et al. 2009, p. 55; See also Wallach and Allan 2009, p. 201, 207; Asaro 2008.

  160. 160.

    Common Article 1 to the 1949 Geneva Conventions requires States to ‘respect and ensure respect’ for the Geneva Conventions. Although there does not seem to be agreement as to the scope of this responsibility, it is clear that it would at least extend to the obligation to ensure respect for the relevant legal rules within their national jurisdiction.

  161. 161.

    Lin et al. 2008.

  162. 162.

    With the hybrid approach consisting of both top-down and bottom-up aspects; Ibid., pp. 27–42.

  163. 163.

    This kind of theory has its origins in the deontological understanding that ethics are intrinsically duty-based, and that being moral effectively means fulfilling one’s duties. Kant’s categorical imperative is a shining example of a deontological top-down theory, as are Asimov’s laws; Lin et al. 2008, p. 28.
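
    To make the top-down/bottom-up contrast concrete, the following purely illustrative sketch (with invented rule names and scores, not an implementation described by Lin et al.) shows how a top-down layer is usually imagined: a fixed set of prohibitions vetoes candidate actions, and whatever survives is left to a separate, for instance learned, bottom-up component to rank:

```python
from typing import Callable, Iterable, Optional

# Illustrative only: hard-coded ("top-down") prohibitions veto candidate
# actions; a separate ("bottom-up") scoring function -- here just a stub --
# ranks whatever remains. Rule names and scores are invented for the example.

Prohibition = Callable[[str], bool]   # returns True if the rule forbids the action

def top_down_filter(candidates: Iterable[str],
                    prohibitions: list[Prohibition]) -> list[str]:
    """Keep only those candidate actions that no prohibition forbids."""
    return [a for a in candidates if not any(rule(a) for rule in prohibitions)]

def hybrid_choice(candidates: list[str],
                  prohibitions: list[Prohibition],
                  score: Callable[[str], float]) -> Optional[str]:
    """Deontological veto first, then pick the highest-scoring survivor."""
    allowed = top_down_filter(candidates, prohibitions)
    return max(allowed, key=score) if allowed else None

# Toy usage with made-up rules and scores:
no_attack_on_civilian_objects = lambda a: a == "attack_civilian_object"
print(hybrid_choice(
    ["attack_civilian_object", "hold_fire", "withdraw"],
    [no_attack_on_civilian_objects],
    score=lambda a: {"hold_fire": 0.6, "withdraw": 0.4}.get(a, 0.0),
))  # -> "hold_fire"
```

    The veto mechanism itself is trivial; as the discussion in the text suggests, the difficulty lies in deciding whether a real situation actually falls under a rule at all.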

  164. 164.

    Lin et al. 2008, 38 ff, p. 88.

  165. 165.

    Ibid.

  166. 166.

    The principle of distinction requires that only combatants, but never civilians, are made the direct object of attack (civilians are negatively defined as all those persons who are not members of the armed forces of a party to an armed conflict, with the exception of religious and medical personnel; GC III, supra note 83, Articles 4, 6; AP I, supra note 94, Articles 43, 50). In cases where civilians are not directly targeted, however, their loss of life may be acceptable as ‘collateral damage’ (Stein 2004). Yet the distinction between a civilian and a combatant has become increasingly blurred during modern conflicts, which are often non-international in kind and consequently involve non-State actors. In non-international, as in international armed conflicts, civilians enjoy immunity from attack for as long as they do not engage in any ‘direct participation in hostilities’ (AP II, supra note 94, Article 51 (2)). This notion, however, is a hotly debated topic and the discussion surrounding the ICRC Interpretative Guide to the notion of ‘DPH’ is far from settled (Melzer 2009). Like most other rules of IHL, the prohibition on killing civilians is therefore highly nuanced.
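
    Purely as a schematic rendering of the framing in this note (the predicates are hypothetical placeholders, and nothing here should be read as a statement of the law), the rule-like surface of the principle of distinction can be written down in a few lines; the point is that the formal structure is trivial while the inputs require exactly the contextual judgement at issue:

```python
from dataclasses import dataclass

@dataclass
class Person:
    # Placeholder attributes; establishing them in a real conflict is the hard part.
    member_of_armed_forces: bool
    religious_or_medical: bool     # excluded from combatant status in this note's framing
    directly_participating: bool   # the hotly debated 'DPH' notion

def may_be_targeted(p: Person) -> bool:
    """Naive rendering: combatants may be attacked; civilians only while they
    take a direct part in hostilities."""
    if p.member_of_armed_forces and not p.religious_or_medical:
        return True
    return p.directly_participating

print(may_be_targeted(Person(False, False, False)))  # False: protected civilian
```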

  167. 167.

    AP I, supra note 94, Article 51(5) (b).

  168. 168.

    AP I, supra note 94, Article 57 (3); Achieving international agreement as to what constitutes military necessity has also proven difficult, but it is clear that it is meant as a restriction rather than a permissive rule in the sense that all means or methods of warfare that are not directly necessary for the attainment of a definite military advantage are prohibited, not that all means necessary for attaining such a goal are allowed; Kwakwa 1992, p. 36.

  169. 169.

    Kwakwa 1992, p. 36.

  170. 170.

    Ibid.

  171. 171.

    Pictet 1985, p. 62.

  172. 172.

    Lin et al. 2008, p. 38.

  173. 173.

    Ibid.

  174. 174.

    Compare: GC I, Article 47; GC II, Article 48; GC III, Article 127; GC IV, Article 144, supra note 83.

  175. 175.

    The ICRC has issued a comprehensive handbook as a reference guide for the implementation of IHL, ICRC 2011a.

  176. 176.

    Morality and humanity thus differ from philosophical understandings such as Kant’s ‘categorical imperative’ which is an ultimate obligation that is not dependent on a certain situation; Wortel 2009, p. 790.

  177. 177.

    The Rome Statute of the International Criminal Court, supra note 77, Articles 25, 28, 30, 31; See also the Statute of the International Tribunal for the Former Yugoslavia, Security Council Resolution 827, 25 May 1993, S/RES/827, http://www.icty.org/x/file/Legal%20Library/Statute/statute_sept09_en.pdf, Article 7.

  178. 178.

    Simon 1982 (emphasis added).

  179. 179.

    Sparrow 2007, p. 68; Arkin even argues that using robots would lower the number of civilian deaths, Arkin 2007, p. 57.

  180. 180.

    Sharkey 2008b, p. 16.

  181. 181.

    Wallach and Allan 2009.

  182. 182.

    Isenberg cited in Borenstein 2008, p. 8.

  183. 183.

    ICJ, Case Concerning the Aerial Incident of 3 July 1988 (Islamic Republic of Iran v. United States of America), Settlement Agreement, 9 February 1996, http://www.icj-cij.org/docket/files/79/6639.pdf.

  184. 184.

    Against this background, it is particularly worrisome that more research is underway to develop systems of artificial intelligence to analyse all potentially relevant incoming data, identifying it as friendly or hostile and re-presenting the respective conclusion to a human operator; Sparrow 2007, p. 69.

  185. 185.

    Singer 2010, p. 31, 36; Arkin 2007, pp. 6, 7.

  186. 186.

    As Patricio Perez describes in Van Baarda 2004.

  187. 187.

    Weiner 2005, citing Gordon Johnson, Joint Forces Commander at the Pentagon.

  188. 188.

    Borenstein 2008, p. 4.

  189. 189.

    Sharkey 2008b.

  190. 190.

    Van Baarda 2004.

  191. 191.

    Office of the Surgeon Multinational Force-Iraq and Office of the Surgeon General United States Army Medical Command (2006) cited in Borenstein 2008, p. 2.

  192. 192.

    Wortel 2009, p. 787.

  193. 193.

    This scenario describes precisely the situation in which principles like the Radbruch formula discussed above are relevant.

  194. 194.

    Rome Statute of the International Criminal Court, supra note 77, Article 33.

  195. 195.

    Pictet 1956, p. 16.

  196. 196.

    Van Baarda 2004.

  197. 197.

    See also Hamilton and Reed 2009.

  198. 198.

    Weintraub 2002.

  199. 199.

    Schmitt 2007.

  200. 200.

    CNN 2009.

  201. 201.

    Dunlap 2007, pp. 117–125.

  202. 202.

    Dunlap 2007, p. 122; See also Parks 1990.

  203. 203.

    This is reflected in many treaties as well as military manuals and other practice. See for instance GC III, Articles 26, 87; and GC IV, Article 33, supra note 83; AP I, Article 75 (2) (d); and AP II, Article 4 (2) (b), supra note 94; Or the Draft Code of Crimes against the Peace and Security of Mankind, 1991, International Law Commission, A/CN.4/L.459 and Add.1, Yearbook of the International Law Commission, Volume 1, http://untreaty.un.org/ilc/documentation/english/a_cn4_l459.pdf, Article 22(2) (a); For more practice in this respect see Doswald-Beck and Henckaerts 2005, Rule 103 and the practice relating to Rule 103.

  204. 204.

    This is further reflected in the increasing endorsement of concepts such as the Responsibility to Protect, itself based on an understanding of moral responsibility towards people in need; ICISS 2001.

  205. 205.

    Krishnan 2009; Sparrow 2007; Sharkey 2007.

  206. 206.

    Kahn has suggested that for non-state actors at a technological disadvantage, terrorism may be the only way to fight back; Kahn 2002.

  207. 207.

    Krishnan 2009.

  208. 208.

    Sharkey cited in Bowcott 2010. See also BBC News 2011a, b.

  209. 209.

    Sri Lanka, for instance, has invested in robotic weapons; Singer 2010, citing evidence of such developments within the Tamil Tigers.

  210. 210.

    Dr. Steve Wright, Reader in Applied Global Ethics at Leeds Metropolitan University, cited in Bowcott 2010.

  211. 211.

    This distinction has been challenged post-Nuremberg, not least by the ICJ, Legality of the Threat or Use of Nuclear Weapons, op. cit., para 105. The Court was unable to pronounce an absolute prohibition of nuclear weapons, leaving room for their use in ‘extreme circumstances of self-defence’, thereby seemingly blurring the two categories; Sharma 2008, pp. 9, 18; This tendency has been largely rejected by Moussa 2008, p. 263; Sloan 2009, p. 47.

  212. 212.

    United Nations, Charter of the United Nations, San Francisco, 24 October 1945, 1 United Nations Treaty Series XVI; See also Lin 2010.

  213. 213.

    CNN 2009.

  214. 214.

    Jewell 2004.

  215. 215.

    Recently, even the ICRC has been concerned with violence and computer games and their influence on war. See ICRC 2011c.

  216. 216.

    Johansen 2011.

  217. 217.

    Also referred to as externalisation; Grossman’s seminal book ‘On Killing’ (1995) describes how killing becomes easier via distance and atrocities become more likely; Singer 2009, p. 44.

  218. 218.

    Wallach and Allan refer to the 2001 ARMS (Autonomous Robots for Military Systems) study by Singh and Trayer. They note that ethics and morality do not come up anywhere in the seventy-two-page text, and safety is mentioned only in the titles of other cited works; Wallach and Allan 2009, p. 223.

  219. 219.

    Examples include policing or humanitarian assignments.

  220. 220.

    Ibid.

  221. 221.

    Hudson 2011.

  222. 222.

    Most funding for research into robotics and artificial intelligence comes from the military; Sparrow 2007, p. 62.

  223. 223.

    The way in which robots may indeed transform war has already received some attention from the scholarly community. For instance, the conference ‘Drone Wars’ was held in London on 18 September 2010, and a three-day workshop was organised in Berlin on 20–22 September 2010 by the International Committee for Robot Arms Control (ICRAC). In late 2011, the delegation of the ICRC in Israel and the Occupied Territories, together with the Minerva Center for Human Rights and the Hebrew University of Jerusalem, held a conference on ‘New Technologies, Old Law: Applying International Humanitarian Law in a New Technological Age’ touching on the issue.

  224. 224.

    Sparrow 2007, p. 67.

  225. 225.

    Sharkey 2007, p. 122.

  226. 226.

    Irving 2010.


Author information


Correspondence to Hanna Brollowski.


Copyright information

© 2013 T.M.C. ASSER PRESS, The Hague, The Netherlands, and the author(s)

About this chapter

Cite this chapter

Brollowski, H. (2013). Military Robots and the Principle of Humanity: Distorting the Human Face of the Law?. In: Matthee, M., Toebes, B., Brus, M. (eds) Armed Conflict and International Law: In Search of the Human Face. T.M.C. Asser Press, The Hague, The Netherlands. https://doi.org/10.1007/978-90-6704-918-4_3
