Responsibility for AI

Chapter in Robot Rules

Abstract

In this chapter, Turner describes how various legal mechanisms could be used to determine who or what is responsible for AI when it causes harm or creates something of value. Existing regimes which could be applied to AI include negligence, product liability, vicarious liability, contract, insurance and no-fault compensation schemes, as well as the criminal law. However, owing to AI’s independent agency and its ability to act in an unforeseeable fashion, there are difficulties in applying each set of rules. AI can also create new content, from music and art to industrial design. Turner shows that running through each legal area surveyed is a tension as to whether AI systems should be treated as objects or as subjects: as things or as persons.


Notes

  1.

    Private law is sometimes also called “civil law”. However, this terminology can be confusing because the term “civil law” can also be used to describe legal systems that are founded upon one great codification (such as the Code civil in France or the Bürgerliches Gesetzbuch in Germany) and in which judicial precedent does not play the same crucial role as in Common law systems. See further Chapter 6 at Sections 3.1 and 6.3.2.

  2.

    For a formal account of this model, see the discussion of Hohfeld’s incidents in Chapter 4 at s. 1.1.

  3.

    Gary Slapper and David Kelly, The English Legal System (6th edn. London: Cavendish Publishing), 6.

  4.

    Courts can also order parties to do or not do certain things, for example, a court may require a party to cease production of a mobile phone containing technology which has been unlawfully copied from another company.

  5.

    For discussion, see H.L.A. Hart, Punishment and Responsibility: Essays in the Philosophy of Law (Oxford: Oxford University Press, 2008). See also the American Law Institute Model Penal Code, as Adopted at the 1962 Annual Meeting of The American Law Institute at Washington, DC, 24 May 1962, para. 1.02(2) for a slightly expanded list of aims along the same lines.

  6.

    John H. Farrar and Anthony M. Dugdale, Introduction to Legal Method (London: Sweet & Maxwell, 1984), 37.

  7.

    Evidence of Lord Denning, Report of the Royal Commission on Capital Punishment, 1949–1953 (Cmd. 8932, 1953), s.53.

  8.

    See, for example, “Felon Voting Rights”, National Conference of State Legislatures, http://www.ncsl.org/research/elections-and-campaigns/felon-voting-rights.aspx, accessed 1 June 2018; Hanna Kozlowska, “What would happen if felons could vote in the US?”, Quartz, 6 October 2017, https://qz.com/784503/what-would-happen-if-felons-could-vote/, accessed 1 June 2018.

  9.

    For an attempt at a systematic categorisation of obligations under English law, see English Private Law, edited by Andrew Burrows (3rd edn. Oxford: Oxford University Press, 2017).

  10.

    For a classical statement, see Frederick Pollock, The Law of Torts: A Treatise on the Principles of Obligations Arising from Civil Wrongs in the Common Law (5th edn. London: Stevens & Sons, 1897), 3–4.

  11.

    The distinction between civil liability arising from tort and contract may be traced at least as far as Roman law. The Institutes of Gaius (compiled c. 170 AD) stipulate that obligations could arise under two headings: ex delicto and ex contracto. The Institutes of Justinian (compiled in the sixth century AD) added two more categories, namely quasi ex delicto and quasi ex contractu. The latter are outside the scope of the present work. For discussion, see Lord Justice Jackson, “Concurrent Liability: Where Have Things Gone Wrong?”, Lecture to the Technology & Construction Bar Association and the Society of Construction Law, 30 October 2014, https://www.judiciary.gov.uk/wp-content/uploads/2014/10/tecbarpaper.pdf, accessed 1 June 2018.

  12.

    Civil wrongs are referred to in some systems as “delicts” or “torts”. The source of the latter is the Latin torquere, to twist, which became in Medieval Latin tortum: a wrong or injustice. In the French Civil Code, the relevant Chapter is entitled: Des délits et des quasi-délits.

  13.

    There may well also be criminal consequences in this type of situation.

  14.

    Donal Nolan and John Davies, “Torts and Equitable Wrongs”, in English Private Law, edited by Burrows (3rd edn. Oxford: Oxford University Press, 2017), 934.

  15.

    [1932] A.C. 562. See also Percy Winfield, “The History of Negligence in the Law of Torts”, Law Quarterly Review, Vol. 42 (1926), 184, an article which pre-dated the Donoghue judgement by some six years.

  16.

    Duties in contract and tort can, however, exist concurrently. See the judgement of the UK House of Lords in Henderson v. Merrett Syndicates [1994] UKHL 5.

  17.

    Ibid., 580–581.

  18.

    Art. 1382 of the French Civil Code provides that “any act of man, which causes damages to another, shall oblige the person by whose fault it occurred to repair it”. Art. 1383 states: “One shall be liable not only by reason of one’s acts, but also by reason of one’s imprudence or negligence”. The precise standard to which a person is held when considering whether they are at fault is not defined. However, it appears that, much like the common law standard of negligence, fault is an error of conduct measured against the standard of a reasonable man. British Institute of International and Comparative Law, “Introduction to French Tort Law”, https://www.biicl.org/files/730_introduction_to_french_tort_law.pdf, accessed 1 June 2018. All translations of the French Civil Code herein are those of Prof. Georges Rouhette with the assistance of Dr. Anne Rouhette-Berton, http://www.fd.ulisboa.pt/wp-content/uploads/2014/12/Codigo-Civil-Frances-French-Civil-Code-english-version.pdf, accessed 1 June 2018.

  19.

    German law has an equivalent provision in s. 823 of the German Civil Code: “A person who, intentionally or negligently, unlawfully injures the life, body, health, freedom, property or another right of another person is liable to make compensation to the other party for the damage arising from this”, https://www.gesetze-im-internet.de/bgb/__823.html, accessed 1 June 2018.

  20.

    Tort Law of the People’s Republic of China, 2009. For an English translation of the text, see World Intellectual Property Organisation website, http://www.wipo.int/edocs/lexdocs/laws/en/cn/cn136en.pdf, accessed 1 June 2018. For discussion, see Ellen M. Bublick, “China’s New Tort Law: The Promise of Reasonable Care”, Asian-Pacific Law & Policy Journal, Vol. 13, No. 1 (2011), 36–53, 44. Bublick writes: “To an outsider, the American notion of reasonable care for the safety of others seems compatible with the Chinese concept of ‘harmony,’ particularly if the legal focus on reasonable care for the safety of others is seen as creating a norm that generates moral and cultural power in its own right, not just when sanctions are imposed after a breach”.

  21.

    See, for example, the English case Bolton v. Stone [1951] AC 850, HL in which the court set out the factors to be taken into account in determining liability.

  22.

    The levels of precautions which a person is required to take so as to avoid causing harm to others can vary from system to system. In the UK, the approach is slightly less mechanical, in that certain other factors can serve to adjust the duties. The UK courts take into account positive externalities arising from dangerous conduct, as well as potential negative externalities. If an action is socially desirable, then this may reduce the duty to take precautions, notwithstanding the risk of damage eventuating. See Watt v. Hertfordshire CC [1954] 1 WLR 835. See also the US Court of Appeals in United States v. Carroll Towing Co. 159 F.2d 169 (2d. Cir. 1947).

  23.

    See, for example, United States v. Carroll Towing Co. 159 F.2d 169 (2d. Cir. 1947).

  24.

    See, for example, the judgement of the UK Supreme Court in Robinson v. Chief Constable of West Yorkshire Police [2018] UKSC 4.

  25.

    In some systems, contractual and tort liability can be concurrent though. See, for instance, the position in the UK: Henderson v. Merrett [1995] 2 AC 145. In France, contractual and tort claims are non-cumulative, except in cases of professional negligence, under art. 1792 of the French Civil Code. See Simon Whittaker, “Privity of Contract and the Law of Tort: The French Experience”, Oxford Journal of Legal Studies, Vol. 16 (1996), 327, 333–334. In Germany, liability may be concurrent in contract and tort. See Lord Justice Jackson, “Concurrent Liability: Where Have Things Gone Wrong?” Lecture to the Technology & Construction Bar Association and the Society of Construction Law, 30 October 2014, https://www.judiciary.gov.uk/wp-content/uploads/2014/10/tecbarpaper.pdf, accessed 1 June 2018, 6 and the sources cited therein.

  26.

    McQuire v. Western Morning News [1903] 2 KB 100 at 109 per Lord Collins MR.

  27.

    Ryan Abbot, “The Reasonable Computer: Disrupting the Paradigm of Tort Liability”, The George Washington Law Review, Vol. 86, No. 1 (January 2017), 101–143, 138–139.

  28.

    See, for example, s. 3(2) of the UK Automated and Electric Vehicles Act 2018.

  29.

    This is a solution tentatively suggested by Hubbard in F. Patrick Hubbard, “‘Sophisticated Robots’: Balancing Liability, Regulation, and Innovation”, Florida Law Review, Vol. 66 (2015), 1803, 1861–1862.

  30.

    See also Nick Bostrom’s “paperclip machine” thought experiment, discussed in Chapter 1 at s. 6.

  31.

    As to strict liability, see the following section. It will be assumed for present purposes that Abbot’s definition of “autonomous” covers substantially the same entities as AI within this book.

  32.

    Ryan Abbot, “The Reasonable Computer: Disrupting the Paradigm of Tort Liability”, The George Washington Law Review, Vol. 86, No. 1 (January 2017), 101–143, 101.

  33.

    Ibid.

  34.

    Ibid., 140. Some efforts are currently underway at the level of standard-setting bodies such as the International Standards Organisation to establish general rules on these features, so at a minimum the agreement and articulation of such standards will be a prerequisite for Abbot’s scheme to work. For nascent efforts along these lines, see for instance the International Standards Organisation proposal: “ISO/IEC JTC 1/SC 42: Artificial Intelligence”, Website of the ISO, https://www.iso.org/committee/6794475.html, accessed 1 June 2018. See also Chapter 7 at s. 3.5.

  35.

    See Bolam v. Friern Hospital Management Committee [1957] 2 All ER 118, as modified by Bolitho (Administratrix of the Estate of Patrick Nigel Bolitho (deceased)) v. City and Hackney Health Authority [1997] 4 All ER 771. In addition to being accepted by a body of medical practitioners, the practice must not be, in the opinion of the court, unreasonable, illogical or indefensible.

  36.

    For discussion of this problem in medical liability, see Shailin Thomas, “Artificial Intelligence, Medical Malpractice, and the End of Defensive Medicine”, Harvard Law Bill of Health blog, 26 January 2017, http://blogs.harvard.edu/billofhealth/2017/01/26/artificial-intelligence-medical-malpractice-and-the-end-of-defensive-medicine/ (Part I), and http://blogs.harvard.edu/billofhealth/2017/02/10/artificial-intelligence-and-medical-liability-part-ii/ (Part II), accessed 1 June 2018.

  37.

    See Curtis E.A. Karnow, “The Application of Traditional Tort Theory to Embodied Machine Intelligence”, in Robot Law, edited by Ryan Calo, Michael Froomkin, and Ian Kerr (Cheltenham and Northampton, MA: Edward Elgar, 2015), 53.

  38.

    See, for example, H.L.A. Hart, “Legal Responsibility and Excuses”, in Determinism and Freedom in the Age of Modern Science, edited by Sidney Hook (New York: New York University Press, 1958). Hart’s criticisms are of criminal law strict liability, but the same criticisms can be made of civil law.

  39.

    In English law, the paradigm example of such strict liability is the rule in Rylands v. Fletcher (1866) L.R. 1 Ex. 265; (1868) L.R. 3 H.L. 330.

  40.

    See, for example, Justice Frankfurter in United States v. Dotterweich 320 U.S. 277 (1943): “Hardship there doubtless may be under a statute which penalizes the transaction though conscious wrongdoing may be totally wanting. Balancing relative hardships, Congress has preferred to place it upon those who have at least the opportunity of informing themselves of the existence of conditions imposed for the protection of consumers before sharing in illicit commerce, rather than to throw the hazard on the innocent public who are totally helpless”. Another justification for strict liability is that in order to live in a broadly fair society, a person who seeks gain must assume the potential price of risks associated with that gain. Tony Honoré has called this “outcome responsibility”. See Tony Honoré, “Responsibility and Luck: The Moral Basis of Strict Liability”, Law Quarterly Review, Vol. 104 (October 1988), 530–553, 553. Stapleton has expressed this idea as follows: “Perhaps [strict liability] can be accounted for solely by the pragmatic interest in ease of adjudication which is achieved by adopting the (impossible) target of a perfect production-line norm across all units, but it seems more likely that such a widespread consensus also has a moral dimension - a view that enterprise should pay its way for this bad luck, even if unavoidable”. Jane Stapleton, Product Liability (London: Butterworths, 1994), 189.

  41.

    Under the Products Liability Directive, the producer is “…the manufacturer of a finished product, the producer of any raw material or the manufacturer of a component part and any person who, by putting his name, trademark or other distinguishing feature on the product presents himself as its producer”. Products Liability Directive, art. 2(1).

  42.

    See Products Liability Directive, art. 1.

  43.

    Though there were some judicial moves towards this position in the earlier part of the twentieth century. For a notable example, see Justice Traynor’s concurring opinion in the US case Escola v. Coca Cola Bottling Co. 24 Cal. 2d 453, 461, 150 P.2d 436, 440 (1944). In 1931, Justice Cardozo had noted “The assault upon the citadel of privity is proceeding in these days apace”. Ultramares Corp. v. Touche, 255 N.Y. 170, 180, 174 N.E. 441, 445 (1931).

  44.

    See, for example, the UK’s investigation into the issue: “Lord Chancellor’s Department: Royal Commission on Civil Liability and Compensation for Personal Injury”, better known as the “Pearson Commission” LCO 20, which was established in 1973 and reported in 1978 (Cmnd. 7054, Vol. I, Chapter 22). Its terms of reference included considering liability for death or personal injury “…through the manufacture, supply or use of goods or services”. See also The Law Commission and the Scottish Law Commission, Liability for Defective Products (June 1977) Cmnd. 6831; Strasbourg Convention on Products Liability in Regard to Personal Injury and Death, Council of Europe, 27 January 1977. See also Ontario Law Reform Commission, Report on Product Liability (Ministry of the Attorney-General, 1979).

  45.

    There are some differences between them, but the following analysis will concentrate on the shared features which appear to be common to these and other systems around the world which have been based on them. For such a comparison, see, for example, Lord Griffiths, Peter de Val, and R.J. Dormer, “Developments in English Product Liability Law: A Comparison with the American System”, Tulane Law Review, Vol. 62 (1987–1988), 354.

  46.

    Council Directive 85/374/EEC of 25 July 1985 on the approximation of the laws, regulations and administrative provisions of the Member States concerning liability for defective products (hereafter the Products Liability Directive). As a Directive, this piece of legislation is not directly binding on individuals but rather must be transposed by individual Member States. See, for example, the Consumer Protection Act 1987 in the UK; art. 1386 (1–18) in the French Civil Code.

  47.

    Restatement (Third) of Torts: Products Liability paras. 12–14, at 206, 221, 227 (1997). The USA does not have Federal laws of product liability. Instead, these matters are addressed state by state. The Restatement on Products Liability is an attempt by the American Law Institute to compile existing jurisprudence on the area. See Mark Shifton, “The Restatement (Third) of Torts: Products Liability - The ALI’s Cure for Prescription Drug Design Liability”, Fordham Urban Law Journal, Vol. 29, No. 6 (2001), 2343–2386. For discussion see Lawrence B. Levy and Suzanne Y. Bell, “Software Product Liability: Understanding and Minimizing the Risks”, Berkeley Technology Law Journal, Vol. 5, No. 1 (1990), 2–6; Michael C. Gemignani, “Product Liability and Software”, Rutgers Computer & Technology Law Journal, Vol. 8 (1981), 173, 204, esp. at 199 et seq. and at FN 70.

  48.

    Ibid., art. 6(1).

  49.

    David G. Owen, Products Liability Law (2nd edn. St. Paul, MN: Thompson West, 2008), 332 et seq.

  50.

    Two of the UK Law Commissioners whose report preceded the legislation implementing the Defective Products Directive, the Consumer Protection Act 1987, suggested that the UK courts would be likely to adopt a similar approach to the tripartite categorisation of defects. See Lord Griffiths, Peter de Val, and R.J. Dormer, “Developments in English Product Liability Law: A Comparison with the American System”, Tulane Law Review, Vol. 62 (1987–1988), 354. The UK courts have been somewhat more reticent about adopting the US approach wholesale though: A and Others v. National Blood Authority and another [2001] 3 All ER 289, in which Burton J preferred the terminology “standard” and “non-standard”, rather than “manufacturing defect” and “design defect”. It is questionable, however, how much difference this change in terminology makes in practice.

  51.

    Ellen Wang and Yu Du, “Product Recall: China”, Getting the Deal Through, November 2017, https://gettingthedealthrough.com/area/31/jurisdiction/27/product-recall-china/, accessed 1 June 2018.

  52.

    Discussed in Fumio Shimpo, “The Principal Japanese AI and Robot Strategy and Research Toward Establishing Basic Principles”, Journal of Law and Information Systems, Vol. 3 (May 2018).

  53.

    For a similar real-life fact pattern, see Danny Yadron and Dan Tynan, “Tesla Driver Dies in First Fatal Crash While Using Autopilot Mode”, The Guardian, 1 July 2016, https://www.theguardian.com/technology/2016/jun/30/tesla-autopilot-death-self-driving-car-elon-musk, accessed 1 June 2018.

  54.

    Horst Eidenmüller, “The Rise of Robots and the Law of Humans”, Oxford Legal Studies Research Paper No. 27/2017, 8.

  55.

    Michael C. Gemignani, “Product Liability and Software”, Rutgers Computer & Technology Law Journal, Vol. 8 (1981), 173, 204.

  56.

    See, for instance, Andrea Bertolini, “Robots as Products: The Case for a Realistic Analysis of Robotic Applications and Liability Rules”, Law Innovation and Technology, Vol. 5, No. 2 (2013), 214–247, 238–239; Jeffrey K. Gurney, “Sue My Car Not Me: Products Liability and Accidents Involving Autonomous Vehicles”, University of Illinois Journal of Technology Law and Policy (2013), 247–277, 257; and Horst Eidenmüller, “The Rise of Robots and the Law of Humans”, Oxford Legal Studies Research Paper No. 27/2017, 8.

  57.

    938 F.2d 1033 (9th Cir. 1991). See also Alm v. Van Nostrand Reinhold, Co., 480 N.E.2d 1263 (Ill. App. Ct. 1985): a book on construction that led to injuries. In Brocklesby v. United States 767 F.2d 1288 (9th Cir. 1985), the court held a publisher of an instrument approach procedure for aircraft strictly liable for injuries incurred due to the faulty information.

  58.

    Fumio Shimpo, “The Principal Japanese AI and Robot Strategy and Research Toward Establishing Basic Principles”, Journal of Law and Information Systems, Vol. 3 (May 2018).

  59.

    European Commission, “Evaluation of the Directive 85/374/EEC concerning liability for defective products”, http://ec.europa.eu/smart-regulation/roadmaps/docs/2016_grow_027_evaluation_defective_products_en.pdf, accessed 1 June 2018.

  60.

    Results of the public consultation on the rules on producer liability for damage caused by a defective product, 29 April 2017, http://ec.europa.eu/docsroom/documents/23470, accessed 1 June 2018.

  61.

    “Brief factual summary on the results of the public consultation on the rules on producer liability for damage caused by a defective product”, 30 May 2017, GROW/B1/HI/sv(2017) 3054035, http://ec.europa.eu/docsroom/documents/23471, accessed 1 June 2018.

  62.

    Ibid., 26–27.

  63.

    The Commission has announced that “[b]y mid-2019 the Commission will also issue guidance on the interpretation of the Product Liability Directive in the light of technological developments, to ensure legal clarity for consumers and producers in case of defective products”. See European Commission, “Press Release: Artificial intelligence: Commission outlines a European approach to boost investment and set ethical guidelines”, Website of the European Commission, 25 April 2018, http://europa.eu/rapid/press-release_IP-18-3362_en.htm, accessed 1 June 2018.

  64.

    Products Liability Directive, art. 7. For the relationship between these defences and those available under the US system, see Lord Griffiths, Peter de Val, and R.J. Dormer, “Developments in English Product Liability Law: A Comparison with the American System”, Tulane Law Review, Vol. 62 (1987–1988), 354, 383–385.

  65.

    See also art. 6(2) of the Directive: “A product shall not be considered defective for the sole reason that a better product is subsequently put into circulation”. This may have been a reasonable rule for traditional industrial products, but seems ill-suited for software where everyone rightly expects constant security updates, patches, bug fixes, etc. This is not a problem unique to AI, but it is especially pertinent for programs which by their nature learn and improve over time.

  66.

    This technique is described in various contexts as liability through agency, employment or vicarious liability, but broadly speaking they reflect the same central idea. For consistency, they will be referred to collectively as vicarious liability.

  67.

    For Roman law, see William Buckland, The Roman Law of Slavery: The Condition of the Slave in Private Law from Augustus to Justinian (Cambridge: Cambridge University Press, 1908). For Islamic law, see the discussion in Muhammad Taqi Usmani, An Introduction to Islamic Finance (London: Kluwer Law International, 2002), 108.

  68.

    See Evelyn Atkinson “Out of the Household: Master-Servant Relations and Employer Liability Law”, Yale Journal of Law & the Humanities, Vol. 25, No. 2, art. 2 (2013).

  69.

    See Lister v. Hesley Hall Ltd [2001] UKHL 22, in which a boarding house for children was found vicariously liable for abuse of children carried out by one of its employees, the warden.

  70.

    See, for example, art. 1384 of the French Civil Code: “A person is liable not only for the damages he causes by his own act, but also for that which is caused by the acts of persons for whom he is responsible, or by things which are in his custody…”.

  71.

    The French language original is as follows: «On est responsable non seulement du dommage que l’on cause par son propre fait, mais encore de celui qui est causé par le fait des personnes dont on doit répondre, ou des choses que l’on a sous sa garde.»

  72.

    In the UK, such claims are made pursuant to the Civil Liability (Contribution) Act 1978.

  73.

    This applies even to parents and children. The French Civil Code provides in art. 1384: “(Act of 5 April 1937) The above liability exists, unless the father and mother or the craftsmen prove that they could not prevent the act which gives rise to that liability”.

  74.

    [2016] UKSC 11.

  75.

    Ibid., [45]–[47].

  76.

    This may not be far away. It was reported in June 2017 that the Dubai police had deployed a robotic patrol officer: Agence France-Presse, “First Robotic Cop joins Dubai police”, 1 June 2017, http://www.telegraph.co.uk/news/2017/06/01/first-robotic-cop-joins-dubai-police/, accessed 1 June 2018. In reality, the “robot” does not appear to use AI, but rather acts more as a mobile computer interface which allows humans to seek information and report crimes. Nonetheless, it is apparent from examples such as this that people are increasingly accepting of the prospect of roles such as police officers being undertaken by AI/robots.

  77.

    The victim would be likely to lose the right to sue the perpetrator insofar as is required to prevent the victim being compensated twice for the same harm (a phenomenon known as “double-recovery”). Accordingly, there may be an exception to this principle for exemplary damages (for extreme conduct such as deliberate and vindictive harm), where such additional damages are not provided for under the compensation scheme.

  78.

    An example of a more limited scheme is §§ 104, 105 Sozialgesetzbuch VII in Germany. The Sozialgesetzbuch VII introduces and regulates a mandatory public insurance for workplace accidents. It is funded through mandatory contributions by all employers. If an employee suffers a workplace accident (“Arbeitsunfall”), that employee (or their family) will be paid compensation from the mandatory insurance scheme. The employer and other co-workers who may have caused the accident negligently are, in turn, freed from liability (unless they have acted wilfully).

  79.

    “The levy setting process”, Website of the Accident Compensation Scheme, https://www.acc.co.nz/about-us/how-levies-work/the-levy-setting-process/?smooth-scroll=content-after-navs, accessed 1 June 2018.

  80.

    Donald Harris, “Evaluating the Goals of Personal Injury Law: Some Empirical Evidence”, in Essays for Patrick Atiyah, edited by Cane and Stapleton (Oxford: Clarendon Press, 1991). Though Harris advocates replacing tort liability for personal injuries with a no-fault compensation system, he admits that the evidence supporting a link between damages liability and deterrence is inconclusive. Harris says in this regard: “the symbolic effect of tort law may greatly exceed its actual impact”.

  81.

    See Uri Gneezy and Aldo Rustichini, “A Fine is a Price”, The Journal of Legal Studies, Vol. 29, No. 1 (2000).

  82.

    “Population”, Government of New Zealand Website, https://www.stats.govt.nz/topics/population?url=/browse_for_stats/population.aspx, accessed 1 June 2018.

  83.

    “Keeping You Safe”, Website of the Accident Compensation Scheme, https://www.acc.co.nz/preventing-injury/keeping-you-safe/, accessed 1 June 2018.

  84.

    For discussions of the more general merits and disadvantages of a no-fault compensation scheme, see, for example, Geoffrey Palmer, “The Design of Compensation Systems: Tort Principles Rule, OK?” Valparaiso University Law Review, Vol. 29 (1995), 1115; Michael J. Saks, “Do We Really Know Anything About the Behavior of the Tort Litigation System—and Why Not?” University of Pennsylvania Law Review, Vol. 140 (1992), 1147; Carolyn Sappideen, “No Fault Compensation for Medical Misadventure-Australian Expression of Interest”, Journal of Contemporary Health Law and Policy, Vol. 9 (1993), 311; Stephen D. Sugarman, “Doing Away with Tort Law”, California Law Review, Vol. 73 (1985), 555, 558; Paul C. Weiler, “The Case for No-Fault Medical Liability”, Maryland Law Review, Vol. 52 (1993), 908; and David M. Studdert, Eric J. Thomas, Brett I.W. Zbar, Joseph P. Newhouse, Paul C. Weiler, Jonathon Bayuk, and Troyen A. Brennan, “Can the United States Afford a “No-Fault” System of Compensation for Medical Injury?” Law & Contemporary Problems, Vol. 60 (1997), 1.

  85.

    There is some academic debate as to whether contract should be defined exclusively in terms of an agreement or promises, but this is outside the scope of the present work. See, for discussion, Chitty on Contracts, edited by Hugh Beale (32nd edn. London: Sweet & Maxwell Ltd, 2015), 1-014–1-024. Art. 2(a) of the Proposal for a Regulation of the European Parliament and of the Council on a Common European Sales Law, COM (2011) 635 final, defines a contract as “an agreement intended to give rise to obligations or other legal effects”.

  86.

    Historically, this was more common but has now been abandoned. Other requirements might include stipulations as to the language of the contract and the jurisdiction to which they are subject. See Mark Anderson and Victor Warner, Drafting and Negotiating Commercial Contracts (Haywards Heath: Bloomsbury Professional, 2016), 18.

  87.

    In some systems, the requirement for something of value to pass is known as “consideration”.

  88.

    However, it can also be the case that a contract, and indeed contractual terms, will be deemed to have been agreed by the parties as a result of their relationship. When a person buys a crate of apples, there is usually an implied term that those apples will not be full of maggots.

  89.

    Kirsten Korosec, “Volvo CEO: We Will Accept All Liability When Our Cars Are in Autonomous Mode”, Fortune, 7 October 2015, http://fortune.com/2015/10/07/volvo-liability-self-driving-cars/, accessed 1 June 2018.

  90.

    [1892] EWCA Civ 1.

  91.

    Fumio Shimpo, “The Principal Japanese AI and Robot Strategy and Research Toward Establishing Basic Principles”, Journal of Law and Information Systems, Vol. 3 (May 2018).

  92.

    Dirk A. Zetzsche, Ross P. Buckley, and Douglas W. Arner, “The Distributed Liability of Distributed Ledgers: Legal Risks of Blockchain”, EBI Working Paper Series (2017), No. 14; “Blockchain & Liability”, Oxford Business Law Blog, 28 September 2017, https://www.law.ox.ac.uk/business-law-blog/blog/2017/09/blockchain-liability, accessed 1 June 2018.

  93.

    Paulius Čerka, Jurgita Grigienė, and Gintarė Sirbikytė, “Liability for Damages Caused by Artificial Intelligence”, Computer Law & Security Review, Vol. 31, No. 3 (June 2015), 376–389.

  94.

    However, the conclusion they point to was apparently reached by UNCITRAL in its deliberations, though it does not formally form part of the convention. This is noted in the materials accompanying the published version of the Convention, which state at 70: “UNCITRAL also considered that, as a general principle, the person (whether a natural person or a legal entity) on whose behalf a computer was programmed should ultimately be responsible for any message generated by the machine (see A/CN.9/484, paras. 106 and 107)”. See http://www.uncitral.org/pdf/english/texts/electcom/06-57452_Ebook.pdf, accessed 1 June 2018.

  95.

    See also s. 9(3) of the UK Copyright, Designs and Patents Act 1988, discussed at s. 4.1, which contains similar language.

  96.

    See, for example, Robert Joseph Pothier, Treatise on Obligations, or Contracts, translated by William David Evans (London: Joseph Butterworths, 1806); James Gordley, The Philosophical Origins of Modern Contract Doctrine (Oxford: Clarendon Press, 1993), Chapter 6.

  97.

    The term “privity” derives from the Latin privatus, meaning “private”.

  98.

    For an influential analysis of the signalling effect of agreements in the labour market, see Michael Spence, “Signaling, Screening and Information”, in Studies in Labor Markets, edited by Sherwin Rosen (Chicago: University of Chicago Press, 1981), 319–358.

  99.

    See, for instance, Parker v. South Eastern Railway Co (1877) 2 CPD 41.

  100.

    Dylan Curran, “Are You Ready? Here Is All the Data Facebook and Google Have on You”, The Guardian, 30 March 2018, https://www.theguardian.com/commentisfree/2018/mar/28/all-the-data-facebook-google-has-on-you-privacy, accessed 1 June 2018.

  101.

    In EU countries, see, for example, the Unfair Terms in Consumer Contracts Directive (93/13/EEC).

  102.

    Additional protection is provided by the various governmental and non-governmental bodies tasked with reviewing and periodically raising awareness of particularly egregious or harmful conduct undertaken by companies under the cover of contractual agreements. See, for example, the Federal Trade Commission in the USA, the Consumer Protection Association in the UK, or the Consumer Rights Organisation in India.

  103.

    See Jacob Turner, “Return of the Literal Dead: An Unintended Consequence of Rainy Sky v. Kookmin on Interpretation?” European Journal of Commercial Contract Law, Vol. 1 (2013).

  104.

    See, generally, Kenneth S. Abraham, Distributing Risk: Insurance, Legal Theory, and Public Policy (New Haven: Yale University Press, 1986).

  105.

    “Primary layer” insurers will often pass on some or even all of the risk above a certain threshold to re-insurers, who may in turn do the same, thereby spreading such risk further through the market.

  106.

    Curtis E.A. Karnow, “Liability for Distributed Artificial Intelligences”, Berkeley Technology Law Journal, Vol. 11, No. 1 (1996), 147–204, 176. Karnow may not be correct in his assessment that higher intelligence leads to more risks; at least some risks in the use of AI arise from its having insufficient intelligence to recognise the costs of its actions or their wider impact. It might be more correct to say that the higher the level of responsibility which AI is accorded, the higher the risks. More intelligent AI is likely to be given more responsibility, thereby creating the link between intelligence and risk (albeit indirectly, and with the caveat that the intelligent AI may well be safer).

  107.

    In the USA, a state-by-state list of mandatory car insurance requirements is provided at the consumer website, The Balance, “Understanding Minimum Car Insurance Requirements”, 18 May 2017, https://www.thebalance.com/understanding-minimum-car-insurance-requirements-2645473, accessed 1 June 2018. For the position in the UK, see “Vehicle Insurance”, UK Government, https://www.gov.uk/vehicle-insurance, accessed 1 June 2018.

  108.

    For early arguments in favour of such a rule, at a time when car driving was in its infancy, see Wayland H. Elsbree and Harold Cooper Roberts, “Compulsory Insurance Against Motor Accidents”, University of Pennsylvania Law Review, Vol. 76 (1927–1928), 690; Robert S. Marx “Compulsory Compensation Insurance”, Columbia Law Review, Vol. 25, No. 2 (February 1925), 164–193; and for a more modern perspective, see Harvey Rosenfield, “Auto Insurance: Crisis and Reform”, University of Memphis Law Review, Vol. 29 (1998), 69, 72, 86–87.

  109.

    For more information on the drafting process see “Automated and Electric Vehicles Act”, Parliament Website, https://services.parliament.uk/bills/2017-19/automatedandelectricvehicles.html, accessed 1 June 2018. See also Chapter 8 at s. 5.3.3.

  110.

    Terrorism cover is often excluded from main policies and provided under a supplementary policy with its own premium.

  111.

    Chapters 7 and 8 of this book set out the potential content for such requirements.

  112.

    Curtis E.A. Karnow, “Liability for Distributed Artificial Intelligences”, Berkeley Technology Law Journal, Vol. 11, No. 1 (1996), 147–204, 196.

  113.

    Indeed, some legal systems expressly prohibit insurance policies from covering wilful acts: see, for example, s. 533 of the California Insurance Code. For commentary, see James M. Fischer, “Accidental or Willful?: The California Insurance Conundrum”, Santa Clara Law Review, Vol. 54 (2014), 69, http://digitalcommons.law.scu.edu/lawreview/vol54/iss1/3, accessed 1 June 2018.

  114.

    Olga Khazan, “Why So Many Insurers Are Leaving Obamacare: How Rejecting Medicaid and Other Government Decisions Have Hurt Insurance Markets”, The Atlantic, 11 May 2017, https://www.theatlantic.com/health/archive/2017/05/why-so-many-insurers-are-leaving-obamacare/526137/, accessed 1 June 2018.

  115.

    J.Ll.J. Edwards, “The Criminal Degrees of Knowledge”, Modern Law Review, Vol. 17 (1954), 294.

  116.

    Extreme carelessness may not suffice for murder, though it could be enough for the lesser crime of “manslaughter”. “Homicide: Murder and Manslaughter”, website of the UK Crown Prosecution Service, http://www.cps.gov.uk/legal/h_to_k/homicide_murder_and_manslaughter/#intent, accessed 1 June 2018.

  117.

    For an exploration of innocent agency, see Peter Alldridge, “The Doctrine of Innocent Agency”, Criminal Law Forum, Autumn 1990, 45.

  118.

    This analysis follows a structure proposed by Gabriel Hallevy in “The Criminal Liability of Artificial Intelligence Entities—From Science Fiction to Legal Social Control”, Akron Intellectual Property Journal, Vol. 4, No. 2, art. 1. Hallevy later expanded on these ideas in two books: Liability for Crimes Involving Artificial Intelligence Systems (Springer, 2015), and When Robots Kill: Artificial Intelligence Under Criminal Law (Boston: Northeastern University Press, 2013).

  119.

    958 P.2d 1083 (Cal. 1998).

  120.

    For a recent restatement of this principle with regard to joint enterprise criminal liability in the UK, see the joint decision of the UK Supreme Court and Judicial Committee of the Privy Council in R v. Jogee, Ruddock v. The Queen [2016] UKSC 8, [2016] UKPC 7.

  121.

    Gabriel Hallevy, “The Criminal Liability of Artificial Intelligence Entities—From Science Fiction to Legal Social Control”, Akron Intellectual Property Journal, Vol. 4, No. 2, art. 1, 13.

  122.

    See generally: Roger Cotterrell, Emile Durkheim: Law in a Moral Domain (Jurists: Profiles in Legal Theory) (Edinburgh: Edinburgh University Press, 1999).

  123.

    See, for example, Kevin M. Carlsmith and John M. Darley, “Psychological Aspects of Retributive Justice”, in Advances in Experimental Social Psychology, edited by Mark Zanna (San Diego, CA: Elsevier, 2008).

  124.

    John Danaher, “Robots, Law and the Retribution Gap”, Ethics and Information Technology, Vol. 18, No. 4 (December 2016), 299–309.

  125.

    Anthony Duff, Answering for Crime: Responsibility and Liability in Criminal Law (Oxford: Hart Publishing, 2007).

  126.

    John Danaher, “Robots, Law and the Retribution Gap”, Ethics and Information Technology, Vol. 18, No. 4 (December 2016), 299–309.

  127.

    See also Chapter 5 at s. 4.5, where this factor is discussed as a potential motivation for giving AI legal personality.

  128.

    See Artificial Intelligence in Engineering Design, edited by Duvvuru Sriram and Christopher Tong (New York: Elsevier, 2012).

  129.

    Bartu Kaleagasi, “A New AI Composer Can Write Music as well as a Human Composer”, Futurism, 9 March 2017, https://futurism.com/a-new-ai-can-write-music-as-well-as-a-human-composer/, accessed 1 June 2018.

  130.

    Elgammal et al., “CAN: Creative Adversarial Networks Generating ‘Art’ by Learning About Styles and Deviating from Style Norms”, paper presented at the Eighth International Conference on Computational Creativity (ICCC), Atlanta, GA, 20–22 June 2017; arXiv:1706.07068v1 [cs.AI], 21 June 2017, https://arxiv.org/pdf/1706.07068.pdf, accessed 1 June 2018.

  131.

    For examples, see Ryan Abbott, “I Think, Therefore I Invent: Creative Computers and the Future of Patent Law”, Boston College Law Review, Vol. 57 (2016), 1079, http://lawdigitalcommons.bc.edu/bclr/vol57/iss4/2, accessed 1 June 2018. See in particular FN 23–138 and accompanying text.

  132.

    Jonathan Turner, Intellectual Property and EU Competition Law (2nd edn. Oxford: Oxford University Press, 2015), at para. 6.03 et seq.

  133.

    C-5/08 Infopaq International v. Danske Dagblades Forening, judgment, paras. 34–39, CJ; C-403 & 429/08 FAPL v. QC Leisure, judgment, paras. 155–156.

  134.

    Eva-Maria Painer v. Standard VerlagsGmbH, Axel Springer AG, Süddeutsche Zeitung GmbH, Spiegel-Verlag Rudolf Augstein GmbH & Co KG, Verlag M. DuMont Schauberg Expedition der Kölnischen Zeitung GmbH & Co KG (Case C-145/10).

  135.

    Ibid. See also SAS Institute v. World Programming, judgment, paras. 65–67, CJ; C-393/09 Bezpečnostní softwarová asociace v. Ministerstvo kultury, judgment, paras. 48–50, CJ.

  136.

    Directive 2001/29, Arts. 2–4; Directive 2006/115, Arts. 3(1), 7, and 9(1).

  137.

    The recitals to Directive 2006/116/EC on the term of protection of copyright and certain related rights refer to cases where “one or more physical persons are identified as authors” (emphasis added)—presumably in distinction to references to “persons” elsewhere in the directive, which would refer to legal persons also.

  138.

    Andres Guadamuz, “Artificial Intelligence and Copyright”, WIPO Magazine, October 2017, http://www.wipo.int/wipo_magazine/en/2017/05/article_0003.html, accessed 1 June 2018. For Spain, see Law No. 22/1987 of 11 November 1987, on intellectual property, and for Germany, see Urheberrechtsgesetz Teil 1 - Urheberrecht (§§ 1–69g), Abschnitt 3 - Der Urheber (§ 7). § 7 UrhG does not state expressly that the author of a copyrighted work has to be a human being. It merely states: “The creator (‘Schöpfer’) is the author”. It is generally understood, though, that the law supposes that only humans can “create” and thus be “creators”.

  139.

    The Compendium of U.S. Copyright Office Practices: Chapter 300, https://copyright.gov/comp3/chap300/ch300-copyrightable-authorship.pdf, accessed 1 June 2018.

  140.

    111 U.S. 53, 58 (1884). The position is supported by later US case law (e.g. Feist Publications v. Rural Telephone Service Company, Inc. 499 U.S. 340 (1991)) which specifies that copyright law only protects “the fruits of intellectual labor” that “are founded in the creative powers of the mind”.

  141.

    519 A.2d 1337, 1338 (Md. 1987), overturned on other grounds in 318 North Market Street, Inc. et al. v. Comptroller of the Treasury, 554 A.2d 453 (Md. 1989).

  142.

    Ibid., at 1339.

  143.

    For discussions of how computer-generated creations might be addressed particularly in US copyright law, as well as a proposal for a general scheme applicable to AI-generated works, see Annemarie Bridy, “Coding Creativity: Copyright and the Artificially Intelligent Author”, Stanford Technology Law Review (2012), 1. See also Ralph D. Clifford, “Intellectual Property in the Era of the Creative Computer Program: Will the True Creator Please Stand Up?” Tulane Law Review, Vol. 71 (1997), 1675, 1696–1697; and Pamela Samuelson, “Allocating Ownership Rights in Computer-Generated Works”, University of Pittsburgh Law Review, Vol. 47 (1985), 1185.

  144.

    New Zealand and Ireland both use the same language. See Copyright Act 1994, s. 2 (New Zealand); Copyright and Related Rights Act 2000, Part I, s. 2 (Act No. 28/2000) (Ireland).

  145.

    Toby Bond, “How Artificial Intelligence Is Set to Disrupt Our Legal Framework for Intellectual Property Rights”, IP Watchdog, 18 June 2017, http://www.ipwatchdog.com/2017/06/18/artificial-intelligence-disrupt-legal-framework-intellectual-property-rights/id=84319/, accessed 1 June 2018. See also Burkhard Schafer et al., “A Fourth Law of Robotics? Copyright and the Law and Ethics of Machine Coproduction”, Artificial Intelligence and Law, Vol. 23 (2015), 217–240; Burkhard Schafer, “Editorial: The Future of IP Law in an Age of Artificial Intelligence”, SCRIPTed, Vol. 13, No. 3 (December 2016), via: https://script-ed.org/wp-content/uploads/2016/12/13-3-schafer.pdf, accessed 1 June 2018.

  146.

    Andrés Guadamuz, “The Monkey Selfie: Copyright Lessons for Originality in Photographs and Internet Jurisdiction”, Internet Policy Review, Vol. 5, No. 1 (2016), https://doi.org/10.14763/2016.1.398, http://policyreview.info/articles/analysis/monkey-selfie-copyright-lessons-originality-photographs-and-internet-jurisdiction, accessed 1 June 2018.

  147.

    NARUTO, a Crested Macaque, by and through his Next Friends, People for the Ethical Treatment of Animals, Inc., Plaintiff-Appellant, v. DAVID JOHN SLATER; BLURB, INC., a Delaware corporation; WILDLIFE PERSONALITIES, LTD., a United Kingdom private limited company, No. 16-15469 D.C. No. 3:15-cv-04324- WHO, https://assets.documentcloud.org/documents/2700588/Gov-Uscourts-Cand-291324-45-0.pdf, accessed 1 June 2018.

  148.

    Jason Slotkin, “‘Monkey Selfie’ Lawsuit Ends With Settlement Between PETA, Photographer”, NPR, 12 September 2017, https://www.npr.org/sections/thetwo-way/2017/09/12/550417823/-animal-rights-advocates-photographer-compromise-over-ownership-of-monkey-selfie, accessed 1 June 2018.

  149.

    “Monkey Selfie Case: Judge Rules Animal Cannot Own His Photo Copyright”, The Guardian, 7 January 2016, https://www.theguardian.com/world/2016/jan/06/monkey-selfie-case-animal-photo-copyright, accessed 1 June 2018. David Slater announced in 2017 that he was “broke” as a result of the court case, despite having ultimately prevailed. Julia Carrie Wong, “Monkey Selfie Photographer Says He’s Broke: ‘I’m Thinking of Dog Walking’”, The Guardian, 13 July 2017, https://www.theguardian.com/environment/2017/jul/12/monkey-selfie-macaque-copyright-court-david-slater, accessed 1 June 2018.

  150.

    Ibid.

  151.

    Meagan Flynn, “Monkey Loses Selfie Copyright Case. Maybe Monkey Should Sue PETA, Appeals Court Suggests”, The Washington Post, 24 April 2018, https://www.washingtonpost.com/news/morning-mix/wp/2018/04/24/monkey-loses-selfie-copyright-case-maybe-monkey-should-sue-peta-appeals-court-suggests/?utm_term=.afe1b1b181d6, accessed 1 June 2018.

  152.

    NARUTO, a Crested Macaque, by and through his Next Friends, People for the Ethical Treatment of Animals, Inc., Plaintiff-Appellant, v. DAVID JOHN SLATER; BLURB, INC., a Delaware corporation; WILDLIFE PERSONALITIES, LTD., a United Kingdom private limited company, No. 16-15469 D.C. No. 3:15-cv-04324- WHO, http://cdn.ca9.uscourts.gov/datastore/opinions/2018/04/23/16-15469.pdf, accessed 1 June 2018, citing at p. 11 Cetacean Community, 386 F.3d at 1171.

  153.

    For the US rules, see 35 U.S.C. §§ 101–102, 112 (2000). In the European system, the criteria are that the invention must be “new, involve an inventive step and are susceptible of industrial application”. Art. 52 European Patent Convention.

  154.

    Ryan Abbott, “Everything Is Obvious”, 22 October 2017, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3056915, accessed 1 June 2018.

  155.

    Constitution of South Africa, s. 16.

  156.

    Constitution of India, art. 19.

  157.

    Toni M. Massaro and Helen Norton, “Siri-ously? Free Speech Rights and Artificial Intelligence”, Northwestern University Law Review, Vol. 110, No. 5 (2016), 1175, citations omitted.

  158.

    Though see Chapter 4 for discussion of when AI might justify such protection in its own right.

  159.

    At present, AI lacks the consciousness required for it to be deemed worthy of non-instrumentalist protections, but as shown in Chapter 4, this may not always be the case.

  160.

    “Lese-majeste Explained: How Thailand Forbids Insult of Its Royalty”, BBC Website, http://www.bbc.co.uk/news/world-asia-29628191, accessed 1 June 2018.

  161.

    Citizens United v. Federal Election Commission, 558 U.S. 310 (2010).

  162.

    Ross Luippold, “Colbert Trolls Fox News By Offering @RealHumanPraise On Twitter, and It’s Brilliant”, Huffington Post, 5 November 2013, http://www.huffingtonpost.co.uk/entry/colbert-trolls-fox-news-realhumanpraise_n_4218078, accessed 1 June 2018.

  163.

    Samuel C. Woolley, “Automating Power: Social Bot Interference in Global Politics”, First Monday, Vol. 21, No. 4 (2016).

  164.

    Alexei Nikolsky and Ria Novosti, “Russia Used Twitter Bots and Trolls ‘to Disrupt’ Brexit Vote”, The Times, 15 November 2017. See also Miles Brundage, Shahar Avin, et al., The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation, February 2018, https://img1.wsimg.com/blobby/go/3d82daa4-97fe-4096-9c6b-376b92c619de/downloads/1c6q2kc4v_50335.pdf, accessed 1 June 2018.

  165.

    Rich McCormick, “Amazon Gives up Fight for Alexa’s First Amendment Rights After Defendant Hands Over Data”, The Verge, 7 March 2017, https://www.theverge.com/2017/3/7/14839684/amazon-alexa-firstamendment-case, accessed 20 August 2018. The case was State of Arkansas v. James A. Bates, Case No. CR-2016-370-2.

  166.

    Helena Horton, “Microsoft Deletes ‘Teen Girl’ AI After It Became a Hitler-Loving Sex Robot Within 24 hours”, The Telegraph, 24 March 2016, http://www.telegraph.co.uk/technology/2016/03/24/microsofts-teen-girl-ai-turns-into-a-hitler-loving-sex-robot-wit/, accessed 1 June 2018. It should be noted that Tay did not generate the content unprompted; various computer programmers swiftly discovered how to game its algorithms to cause it to generate offensive content. See Chapter 8 at s. 3.2.2 for discussion of how the program was corrupted.

  167.

    Yascha Mounk, “Verboten: Germany’s Risky Law for Stopping Hate Speech on Facebook and Twitter”, New Republic, 3 April 2018, https://newrepublic.com/article/147364/verboten-germany-law-stopping-hate-speech-facebook-twitter, accessed 1 June 2018.

  168.

    Toni M. Massaro and Helen Norton, “Siri-ously? Free Speech Rights and Artificial Intelligence”, Northwestern University Law Review, Vol. 110, No. 5 (2016).

Author information


Correspondence to Jacob Turner.


Copyright information

© 2019 The Author(s)

About this chapter

Cite this chapter

Turner, J. (2019). Responsibility for AI. In: Robot Rules. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-319-96235-1_3

  • DOI: https://doi.org/10.1007/978-3-319-96235-1_3

  • Publisher Name: Palgrave Macmillan, Cham

  • Print ISBN: 978-3-319-96234-4

  • Online ISBN: 978-3-319-96235-1

  • eBook Packages: Engineering (R0)
