
Legal Responsibility in the Case of Robotics

Chapter in Developing Support Technologies

Part of the book series: Biosystems & Biorobotics (BIOSYSROB, volume 23)

Abstract

The development of robotics poses problems for ascribing responsibility to specific individuals. A person whose rights are violated by a robot could be left without the possibility of receiving damages or, more generally, of holding anyone legally responsible for the harm done. Several possible solutions to these problems posed by technological developments are discussed, such as the introduction of the so-called “electronic person”. These solutions, however, will have repercussions for social concepts such as personhood, dignity, and responsibility. The article analyzes some of the legal problems posed by robotics and presents and discusses some of the proposed solutions and their possible consequences.

This article is a short version of the earlier publication “The problem of ascribing legal responsibility in the case of robotics”, AI & SOCIETY, 2015.



Notes

  1.

    In the following, “autonomous” is used in a broad sense, meaning nothing more than a certain space for decision-making for the machine. For a project working on different understandings of autonomy and their changes due to new human-technology interactions, see http://www.isi.fraunhofer.de/isi-de/v/projekte/WAK-MTI.php. Of course, notions such as “decision” or “learning”, which I will have to use at some points in this paper, are not meant to imply that these processes are similar to human processes of deciding or learning; they are always insufficient analogies (see the criticism of the similarity argument below). Still, no more adequate notions are currently available to describe these processes, and the analogies do capture the minimal resemblance between humans and machines which is, inter alia, problematic.

  2.

    “Fortunately, these potential failings of man [passion for inflicting harm, cruel thirst for vengeance, unpacific and relentless spirit, fever of revolt, lust of power, etc.] need not be replicated in autonomous battlefield robots” [Ark08, p. 2].

  3.

    The following aspects are discussed in more detail in [EuR13].

  4.

    For the purposes of the Product Liability Directive 85/374, “damage” means (Article 9): damage caused by death or by personal injuries; and damage to an item of property intended for private use or consumption, other than the defective product itself.

  5.

    Article 6 of the Product Liability Directive 85/374 states that a product is defective when “it does not provide the safety which a person is entitled to expect, taking all circumstances into account, including: (a) the presentation of the product; (b) the use to which it could reasonably be expected that the product would be put; (c) the time when the product was put into circulation”.

  6.

    The producer is freed from all liability if he proves (Article 7): “(a) that he did not put the product into circulation; or (b) that, having regard to the circumstances, it is probable that the defect which caused the damage did not exist at the time when the product was put into circulation by him or that this defect came into being afterwards; or (c) that the product was neither manufactured by him for sale or any form of distribution for economic purpose nor manufactured or distributed by him in the course of his business; or (d) that the defect is due to compliance of the product with mandatory regulations issued by the public authorities; or (e) that the state of scientific and technical knowledge at the time when he put the product into circulation was not such as to enable the existence of the defect to be discovered; or (f) in the case of a manufacturer of a component, that the defect is attributable to the design of the product in which the component has been fitted or to the instructions given by the manufacturer of the product”.

  7.

    First of all, again, the notions of adaptation and learning are meant only as analogies to human processes. Second, it is important to note that, although the problem of “many hands” [Jon82] is relevant here in many ways, the possibility of learning and adaptation is in fact the crucial difference. Because this development is built into the machine, one deals not only with the side-effects of cooperation but with the intended effects of the uncontrolled development of machines.

  8.

    See Amtsgericht München, Urteil vom 19.7.2007 – Az.: 275 C 15658/07, NJW RR 2008, 40.

  9.

    This is not a typical legal question and is therefore not discussed in depth here; its background aspects would probably require a separate paper. It is mainly for politics (and for ethics and the political and social sciences) to discuss this question in more detail. Still, I think it should be mentioned because, until now, the main focus of the debate has been the question of responsibility after a third party has been harmed. Before that question arises, one has to decide whether robots should actually be used in a certain area of life at all. This prior question is rarely discussed, in public as well as in the academic debate, although it should, in my eyes, be the first step before discussing responsibility for damages. For further inspiration see, inter alia, [Bro08].

References

  1. Arkin, R. C. (2008). Governing lethal behavior: Embedding ethics in a hybrid deliberative/reactive robot architecture (Technical Report GIT-GVU-07-11). https://smartech.gatech.edu/jspui/bitstream/1853/22715/1/formalizationv35.pdf.

  2. Both, G., & Weber, J. (2014). Hands-free driving? Automatisiertes Fahren und Mensch-Maschine Interaktion. In E. Hilgendorf (Ed.), Robotik im Kontext von Moral und Recht (pp. 171–188). Baden-Baden: Nomos.


  3. Boscarato, C. (2011). Who is responsible for a robot’s actions? In B. van der Berg & L. Klaming (Eds.), Technologies on the stand: legal and ethical questions in neuroscience and robotics (pp. 383–402). Nijmegen: Wolf.


  4. Brownsword, R. (2008). Rights, regulation and technological revolution. Oxford: Oxford University Press.


  5. euRobotics. (2013). Suggestion for a green paper on legal issues in robotics (C. Leroux & R. Labrut, Eds.). http://www.eu-robotics.net/cms/upload/PDF/euRobotics_Deliverable_D.3.2.1_Annex_Suggestion_GreenPaper_ELS_IssuesInRobotics.pdf.

  6. Fitzi, G. (2013). Roboter als ‘legale Personen’ mit begrenzter Haftung. Eine soziologische Sicht. In E. Hilgendorf & J.-P. Günther (Eds.), Robotik und Recht I (pp. 377–398). Baden-Baden: Nomos.


  7. Heyns, C. (2013). Report of the special rapporteur on extrajudicial, summary or arbitrary executions. http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf.

  8. Jonas, H. (1982). Technology as a subject for ethics. Social Research, 49, 891–898.


  9. Weng, Y. H., Chen, C. H., & Sun, C. T. (2009). Toward the human robot co-existence society: On safety intelligence for next generation robots. International Journal of Social Robotics, 1, 267–282.


Judgement

  • Amtsgericht München, Urteil vom 19.7.2007 – Az.: 275 C 15658/07, NJW RR 2008, 40.



Author information


Correspondence to Susanne Beck.


Copyright information

© 2018 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Beck, S. (2018). Legal Responsibility in the Case of Robotics. In: Karafillidis, A., Weidner, R. (eds) Developing Support Technologies. Biosystems & Biorobotics, vol 23. Springer, Cham. https://doi.org/10.1007/978-3-030-01836-8_26


  • DOI: https://doi.org/10.1007/978-3-030-01836-8_26

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-01835-1

  • Online ISBN: 978-3-030-01836-8

  • eBook Packages: Computer Science, Computer Science (R0)
