What Do Human Rights Really Say About the Use of Autonomous Weapons Systems for Law Enforcement Purposes?

Abstract

This chapter supports the view that human rights can help cut through the fog surrounding the concept of autonomy in weapons systems used for law enforcement purposes. Building on the ongoing debate, it will be argued that an approach based on international human rights law would clarify that the only possible and acceptable definition of autonomy implies meaningful human control over the activities of autonomous weapons systems, thus fostering the idea that a ban on fully autonomous machines is desirable. Such a conclusion will be reached after an analysis of States’ positive obligations to protect human rights during law enforcement operations, in particular the right to life and the right to privacy.

Notes

  1.

    McLaughlin and Nasu (2014), p. 2.

  2.

    Final document of the Fifth Review Conference of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (12–16 December 2016) CCW/CONF.V/10, Decision 1.

  3.

    Group of Governmental Experts of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Report of the 2017 session of the Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS) CCW/GGE.1/2017/CRP.1, para. 21.

  4.

    Group of Governmental Experts of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Report of the 2018 session of the Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS) CCW/GGE.1/2018/3, para. 26.

  5.

    Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts (Protocol I), of 8 June 1977, Article 36: “In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party”.

  6.

    This has consistently been affirmed by the European Court of Human Rights in its case-law. See, e.g., Ergi v Turkey (App. No. 66/1997/850/1057), ECtHR [GC], judgment of 28 July 1998, para. 79; Isayeva, Yusupova and Bazayeva v Russia (App. Nos. 57947/00, 57948/00 and 57949/00), ECtHR [GC], judgment of 24 February 2005, paras. 195–200.

  7.

    See ICT for Peace Foundation, Artificial Intelligence: Autonomous Technology (AT), Lethal Autonomous Weapons Systems (LAWS) and Peace Time Threats, Zurich, 21 February 2018.

  8.

    UNCHR, “Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns” (2013), UN Doc. A/HRC/23/47, p. 16, paras. 82–85.

  9.

    Heyns (2016), p. 350 ff.

  10.

    Melzer (2013).

  11.

    HRW, “Shaking the Foundations. The Human Rights Implications of Killer Robots” (12 May 2014), www.hrw.org/report/2014/05/12/shaking-foundations/human-rights-implications-killer-robots.

  12.

    UNCHR, “Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions” (2016), UN Doc A/71/372, p. 13.

  13.

    Brehm (2017).

  14.

    Aust (2019).

  15.

    UNCHR, “General Comment No. 36 on Article 6 of the International Covenant on Civil and Political Rights, on the right to life” (2018), UN Doc. CCPR/C/GC/36, para. 65.

  16.

    Ibid., p. 17.

  17.

    See generally Johns (2016).

  18.

    It is not possible here to scrutinize the impact of the evolution of AI on the conduct of business-related activities. See, in this regard, Lopucki (2018).

  19.

    This definition was coined by the Committee of Ministers of the Council of Europe: “Recommendation Rec(2001)10 of the Committee of Ministers to member States on the European Code of Police Ethics” (2001) Rec(2001)10, Appendix.

  20.

    “The term ‘law enforcement officials’ includes all officers of the law, whether appointed or elected, who exercise police powers, especially the powers of arrest or detention”. See UNGA Resolution 34/169 “Code of Conduct for Law Enforcement Officials” (1979), GAOR 34th Session, Article 1 (“UN Code of Conduct”).

  21.

    Tomuschat (2008), p. 8.

  22.

    See Melzer (2009), p. 91 ff.

  23.

    This is admitted by anti-ban scholars and experts in artificial intelligence. See, for example, Arkin (2009), p. 30: robots will have the technical ability “of independently and objectively monitoring ethical behavior in the battlefield by all parties and reporting infractions that might be observed”.

  24.

    See on this Brehm (2017), pp. 52–54.

  25.

    See, for instance, “Technological convergence, artificial intelligence and human rights”, Report of the Committee on Culture, Science, Education and Media of the Parliamentary Assembly of the Council of Europe, Doc. 14288 of 10 April 2017, para. 53.

  26.

    Schmitt and Thurnher (2013), p. 268.

  27.

    Recommendation CM/Rec(2010)13 of the Committee of Ministers to member States on the protection of individuals with regard to automatic processing of personal data in the context of profiling, Adopted by the Committee of Ministers on 23 November 2010 at the 1099th meeting of the Ministers’ Deputies. See the Appendix at 1, (e).

  28.

    See for example: UNCHR, “Report of the Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia and related intolerance” (2015), UN Doc A/HRC/29/46.

  29.

    ICCPR, Article 17; ECHR, Article 8; ACHR, Article 11.

  30.

    Klass and Others v Germany (App No. 5029/71), ECtHR [GC], judgment of 6 September 1978, para. 49: “the Court stresses that this does not mean that the Contracting States enjoy an unlimited discretion to subject persons within their jurisdiction to secret surveillance. The Court, being aware of the danger such a law poses of undermining or even destroying democracy on the ground of defending it […]”.

  31.

    HRC, “Resolution adopted by the Human Rights Council” (2015), UN Doc A/HRC/RES/28/16.

  32.

    UNGA, “The right to privacy in the digital age” (2014), UN Doc A/RES/68/167.

  33.

    Ibid.

  34.

    European Parliament and Council Regulation (EU) 2016/679 of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (2016) OJ L 119/1. See also European Parliament and Council Directive (EU) 2016/680 of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA (2016) OJ L 119/89.

  35.

    See more generally Shelton and Gould (2013), pp. 564–568.

  36.

    Shelton (2013), p. 23.

  37.

    See generally on this issue Pisillo Mazzeschi (2008), p. 390 ff.

  38.

    See UNCHR, “General Comment No. 31. The Nature of the General Legal Obligation Imposed on States Parties to the Covenant” (2004), UN Doc CCPR/C/21/Rev.1/Add. 13, para. 8. See also UNCHR, “General Comment No. 6: Article 6 (Right to Life)” (1982), para. 3: “The Committee considers that States parties should take measures not only to prevent and punish deprivation of life by criminal acts, but also to prevent arbitrary killing by their own security forces”.

  39.

    Öneryildiz v Turkey (App No. 48939/99), ECtHR [GC], judgment of 30 November 2004, para. 71.

  40.

    Ibid., para. 89.

  41.

    Osman v. the United Kingdom (App. No. 23452/94), ECtHR [GC], judgment of 28 October 1998, para. 115.

  42.

    Ibid.

  43.

    McCann and Others v the United Kingdom (App No. 18984/91), ECtHR [GC], judgment of 27 September 1995, para. 153; Nachova and Others v Bulgaria (App. No 43577/98), ECtHR [GC], judgment of 6 July 2005, para. 95; Makaratzis v Greece (App No. 50385/99), ECtHR [GC], judgment of 20 December 2004, para. 11.

  44.

    Again Makaratzis cit., para. 58.

  45.

    Nachova and Others cit., para. 97.

  46.

    Öneryildiz cit., para. 90; L.C.B. v United Kingdom (App. No. 23413/94), ECtHR [GC], judgment of 9 June 1998, para. 36.

  47.

    Osman cit., para. 116; Demiray v Turkey (App No. 27308/95), ECtHR [GC], judgment of 21 November 2000, para. 45.

  48.

    Pisillo Mazzeschi (2008), pp. 414–417. The ECtHR put it clearly in Kelly and Others v United Kingdom (App. No. 30054/96), ECtHR [GC], judgment of 4 May 2001, para. 96. The same approach is adopted by the Inter-American Court of Human Rights in Velasquez Rodriguez Case, Series C No. 4, IACtHR, judgment of 29 July 1988, paras. 176–177.

  49.

    This is confirmed in international jurisprudence. See, for example, Isayeva, Yusupova and Bazayeva v Russia (App Nos. 57947/00, 57948/00 and 57949/00) ECtHR [GC], judgment of 24 February 2005, para. 210. See also Report No. 55/97, Case No. 11.137: Argentina, Inter-American Commission on Human Rights, OEA/ Ser/L/V/II.98, Doc. 38 (6 December 1997), para. 412.

  50.

    UNCHR, General Comment No. 36, cit., para. 28.

  51.

    Özkan and Others v Turkey (App. No. 21689/93), ECtHR, judgment of 6 April 2004, para. 314.

  52.

    Nachova and Others cit., para. 119.

  53.

    Tomuschat (2008), pp. 93–94.

  54.

    ACHR, Article 30.

  55.

    ECHR, Article 8(2). The ICCPR in Article 17 prohibits “arbitrary or unlawful interference with his privacy” (emphasis added).

  56.

    UNCHR, “General Comment No. 16” in “Note by the Secretariat, Compilation of General Comments and General Recommendations adopted by Human Rights Treaty Bodies” (1988), UN Doc HRI/GEN/1/Rev.1, Vol. I, 191, para. 4.

  57.

    Ibid., para. 8.

  58.

    See, for example, Rotaru v Romania (App. No. 28341/95), ECtHR [GC], judgment of 4 May 2000, para. 55.

  59.

    UNCHR, “Report of the Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism” (2014), UN Doc A/69/397, para. 35 ff.

  60.

    Ibid., para. 39.

  61.

    UNCHR, “Report of the Special Rapporteur on the Freedom of Expression” (2013), UN Doc A/HRC/23/40, para. 58.

  62.

    In this regard, the words of Christof Heyns in his 2013 Report are rather paradigmatic: “The danger here is that the world is seen as a single, large and perpetual battlefield and force is used without meeting the threshold requirements. LARs could aggravate these problems”. See UNCHR, “Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns”, para. 83.

  63.

    Klass and Others cit., para. 50.

  64.

    Szabo and Vissy v Hungary (App. No. 37138/14), ECtHR [GC], judgment of 12 January 2016, para. 82.

  65.

    Roman Zakharov v. Russia (App. No. 47143/06), ECtHR [GC], judgment of 4 December 2015, para. 231.

  66.

    UNGA, “The right to privacy in the digital age” cit., para. 40.

  67.

    Szabo and Vissy v Hungary cit., paras. 85–87.

  68.

    See Klass and Others cit., para. 56. See also Joint Declaration on surveillance programs and their impact on freedom of expression, issued by the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression and the Special Rapporteur for Freedom of Expression of the Inter-American Commission on Human Rights, June 2013, para. 9.

  69.

    Nachova and Others cit., para. 97; see also the Court’s criticism of the “shoot to kill” instructions given to soldiers in McCann and Others, paras. 211–214.

  70.

    ICT for Peace Foundation, Artificial Intelligence cit., p. 19.

  71.

    Ibid., p. 22.

  72.

    See the paper by Larry Lewis for the Center for Autonomy and Artificial Intelligence of the Center for Naval Analyses, pp. 23–24.

  73.

    Comments Supporting the Prohibition of Lethal Autonomous Weapons Systems, Working Paper submitted by the Holy See, 7 April 2016, p. 2.

  74.

    See again Pisillo Mazzeschi (2008), p. 394, specifically. See also Pisillo Mazzeschi (1992), p. 9. See also Barnidge (2006), p. 81.

  75.

    See accordingly Brehm (2008), pp. 382–383.

  76.

    See HRW, “Shaking the Foundations. The Human Rights Implications of Killer Robots”, p. 19. See more in depth on this issue Sparrow (2007), p. 72.

  77.

    For a discussion on this see Amoroso and Tamburrini (2018), pp. 6–7. See also HRW, “Shaking the Foundations” cit., p. 20.

  78.

    Haase and Peters (2017), p. 126 ff.

  79.

    See, generally, Knuckey (2016), p. 164 ff. and Bhuta and Pantazopoulos (2016), p. 299. See also Sassoli (2014), p. 338.

  80.

    Germany, statement on Transparency to the 2015 CCW Meeting of Experts on Lethal Autonomous Weapons Systems (17 April 2015); Sweden, statement on Transparency and the Way Forward to the 2015 CCW Meeting of Experts on Lethal Autonomous Weapons Systems (17 April 2015); Ghana, statement to the 2015 CCW Meeting of Experts on Lethal Autonomous Weapons Systems (17 April 2015).

  81.

    Article 36, “Structuring debate on autonomous weapons systems: memorandum for delegates to the Convention on Certain Conventional Weapons” (November 2013) 3; Amnesty International, “Moratorium on fully autonomous robotics weapons needed to allow the UN to consider fully their far-reaching implications and protect human rights”, written statement to the 23rd session of the UN Human Rights Council (22 May 2013); HRW, “Shaking the Foundations” cit., p. 47.

  82.

    UNCHR, “Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns” cit., paras. 111 and 115.

  83.

    European Parliament, Report with recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL)), 27 January 2017, p. 10, para. 12.

  84.

    See on this point Burrell (2016), p. 1.

  85.

    See accordingly Wachter et al. (2017), p. 76 ff.

  86.

    See www.parliament.uk/business/committees/committees-a-z/commons-select/science-and-technology-committee/news-parliament-2015/algorithms-in-decision-making-inquiry-launch-16-17/.

  87.

    Written evidence submitted by Dr. Janet Bastiman (ALG0029) http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technology-committee/algorithms-indecisionmaking/written/68990.html.

  88.

    Written evidence submitted by Simul Systems Ltd (ALG0007) http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technology-committee/algorithms-indecisionmaking/written/49780.html.

  89.

    Written evidence submitted by Dr Alison Powell (ALG0067) http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technology-committee/algorithms-indecisionmaking/written/69121.html.

References

  • Amoroso, Daniele, and Guglielmo Tamburrini. 2018. The Ethical and Legal Case against Autonomy in Weapon Systems. Global Jurist 18: 1–20.


  • Arkin, Ronald. 2009. Governing Lethal Behavior in Autonomous Robots. London: Chapman & Hall/CRC.


  • Aust, Helmut. 2019. “The System Only Dreams in Total Darkness”: The Future of Human Rights Law in the Light of Algorithmic Authority. German Yearbook of International Law 60: 71–90.


  • Barnidge, Robert P., Jr. 2006. The Due Diligence Principle under International Law. International Community Law Review 8: 81–121.


  • Bhuta, Nehal, and Stavros-Evdokimos Pantazopoulos. 2016. Autonomy and Uncertainty: Increasingly Autonomous Weapons Systems and the International Legal Regulation of Risk. In Autonomous Weapons Systems: Law, Ethics, Policy, ed. Nehal Bhuta et al., 284–300. Cambridge: Cambridge University Press.


  • Brehm, Maya. 2008. The Arms Trade and States’ Duty to Ensure Respect for Humanitarian and Human Rights Law. Journal of Conflict & Security Law 12: 359–387.


  • ———. 2017. Defending the Boundary. Constraints and Requirements on the Use of Autonomous Weapon Systems under International Humanitarian and Human Rights Law. Geneva Academy of International Humanitarian Law and Human Rights (Briefing No. 9).


  • Burrell, Jenna. 2016. How the Machine “Thinks”: Understanding Opacity in Machine Learning Algorithms. Big Data & Society 3: 1–12.


  • Haase, Adrian, and Emma Peters. 2017. Ubiquitous Computing and Increasing Engagement of Private Companies in Governmental Surveillance. International Data Privacy Law 7: 126–136.


  • Heyns, Christof. 2016. Human Rights and the Use of Autonomous Weapons Systems (AWS) During Domestic Law Enforcement. Human Rights Quarterly 38: 350–378.


  • Johns, Fleur. 2016. Global Governance through the Pairing of List and Algorithm. Environment and Planning D: Society and Space 34: 126–149.


  • Knuckey, Sarah. 2016. Autonomous Weapons Systems and Transparency: Towards an International Dialogue. In Autonomous Weapons Systems: Law, Ethics, Policy, ed. Nehal Bhuta et al., 164–184. Cambridge: Cambridge University Press.


  • Lopucki, Lynn M. 2018. Algorithmic Entities. Washington University Law Review 95: 887–953.


  • McLaughlin, Robert, and Hitoshi Nasu. 2014. Introduction: Conundrum of New Technologies in the Law of Armed Conflict. In New Technologies and the Law of Armed Conflict, ed. Robert McLaughlin and Hitoshi Nasu, 1–17. The Hague: TMC Asser Press.


  • Melzer, Nils. 2009. Targeted Killings in International Law. Oxford: Oxford University Press.


  • ———. 2013. Human Rights Implications of the Usage of Drones and Unmanned Robots in Warfare. Directorate-General for External Policies of the Union. Directorate B. Policy Department. Study. EXPO/B/DROI/2012/12. www.europarl.europa.eu/RegData/etudes/etudes/join/2013/410220/EXPO-DROI_ET(2013)410220_EN.pdf.

  • Pisillo Mazzeschi, Riccardo. 1992. The Due Diligence Rule and the Nature of the International Responsibility of States. German Yearbook of International Law 35: 9–51.


  • ———. 2008. Responsabilité de l’État pour violation des obligations positives relatives aux droits de l’homme. Recueil des Cours de l’Académie de Droit International de La Haye 333: 171–506.


  • Sassoli, Marco. 2014. Autonomous Weapons and International Humanitarian Law: Advantages, Open Technical Questions and Legal Issues to be Clarified. International Law Studies 90: 308–340.


  • Schmitt, Michael N., and Jeffrey S. Thurnher. 2013. “Out of the Loop”: Autonomous Weapon Systems and the Law of Armed Conflict. Harvard National Security Journal 4: 231–281.


  • Shelton, Dinah. 2013. Private Violence, Public Wrongs and the Responsibility of States. Fordham International Law Journal 13: 1–34.


  • Shelton, Dinah, and Ariel Gould. 2013. Positive and Negative Obligations. In The Oxford Handbook of International Human Rights Law, ed. Dinah Shelton, 562–586. Oxford: Oxford University Press.


  • Sparrow, Robert. 2007. Killer Robots. Journal of Applied Philosophy 24: 62–77.


  • Tomuschat, Christian. 2008. Human Rights Between Idealism and Realism. Oxford: Oxford University Press.


  • Wachter, Sandra, Brent Mittelstadt, and Luciano Floridi. 2017. Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation. International Data Privacy Law 7: 76–99.



Author information

Corresponding author

Correspondence to Andrea Spagnolo.



Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Spagnolo, A. (2019). What Do Human Rights Really Say About the Use of Autonomous Weapons Systems for Law Enforcement Purposes? In: Carpanelli, E., Lazzerini, N. (eds) Use and Misuse of New Technologies. Springer, Cham. https://doi.org/10.1007/978-3-030-05648-3_3


  • DOI: https://doi.org/10.1007/978-3-030-05648-3_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-05647-6

  • Online ISBN: 978-3-030-05648-3

  • eBook Packages: Law and Criminology, Law and Criminology (R0)