The Construction of a Normative Framework for Technology-Driven Innovations: A Legal Theory Perspective


Abstract

Technological developments change the way we conceive the normative force of law and legal systems. Traditionally based on written texts, and on their interpretation by a professional class of jurists, normativity nowadays seems to be migrating into technological devices, increasing the performative effect of regulation. This shift calls into question the “flexibility” of law as a fundamental achievement of the rule of law and of constitutional democracy. These problems can only be addressed by taking into consideration the multifactorial prism of regulation, in a pluralistic dimension that has been highlighted by studies on the architectural dimension of cyberspace and, in particular, on the “code”. In this perspective, asserting that technological devices are sheer “instruments” divested of normative implications is nothing but an illusion: their regulative force is embedded in their own “design” from the outset. Before envisaging scenarios dominated by ungovernable technology, it is therefore useful to emphasize the “responsibility” of coders and operators. In this way, the question of human responsibility re-emerges as a crucial factor in the elaboration of a normative framework that preserves the conditions of an intersubjective coexistence marked by freedom.


Change history

  • 06 July 2019

The name of the author of Chapter 10 was incorrectly captured as “Francesco Vanna” instead of “Francesco De Vanna”.

Notes

  1. Irti (2007), p. 13 (my own translation).

  2. Rodotà (2012), p. 352. In this regard, Pascuzzi (2016) writes as follows: “Technological standards are defined on the basis of the most advanced knowledge available, in a specific historical context, in a certain field. The industry experts are able to define the most advanced notions: in this sense, they are referred to as technicians. Legal standards in the digital age are determined by technicians (who address other technicians)”, p. 297 (my own translation).

  3. Khanna and Khanna (2012).

  4. “[T]echnologies of the self, which permit individuals to effect by their own means or with the help of others a certain number of operations on their own bodies and souls, thoughts, conduct, and way of being, so as to transform themselves in order to attain a certain state of happiness, purity, wisdom, perfection, or immortality”. Foucault (1988), p. 18.

  5. Salazar (2014), p. 257.

  6. di Robilant (1973), p. 230.

  7. Butler (1872).

  8. “The introduction and the subsequent development of telephone technology, for example, shows the change in interpersonal behaviour: soon it was evident that the ‘sociality’ value embodied by the telephone (albeit involuntarily) was in conflict with the ‘privacy’ one. A telephone call, in fact, is an intrusion into the lives of friends, family members and (potential) customers. The protection of the private space of individuals has been further challenged by the introduction of mobile telephones, through which we can be reached anywhere and at any time. (…) Users have started (…) to call others without any prior notice via email or an SMS, which is regarded as improper behaviour. (…) This change in habits goes hand in hand with a different priority relationship between the values of sociality and privacy. In this context, privacy is stronger than sociality”. Bisol et al. (2014), p. 247.

  9. Warren and Brandeis (1890).

  10. Floridi (2014).

  11. Rodotà (2012), p. 335.

  12. Weiser (1993).

  13. It is no coincidence that the traditional social divisions, represented by classes, have been replaced by new categorisations based on the ability to use the new information technologies. In 1999 the American writer William Gibson said during a radio program: “The future is already here, it’s just not very evenly distributed”.

  14. Balkin (2016), p. 40.

  15. Warren and Brandeis (1890), p. 193.

  16. Westin (1967), p. 7.

  17. Westin (1982), p. 112.

  18. Lévy (1990), pp. 100–101 (my own translation).

  19. Recalling Radbruch (1950), Agata Amato Mangiameli says that “no weaver knows what law weaves”, Amato Mangiameli (2017).

  20. Hildebrandt (2008), p. 175.

  21. For an insight into the meaning of Rechtsstaat, especially in relation to the Rule of Law, see Krygier (2009) and Palombella (2009).

  22. Rawls (1955) and Searle (1964).

  23. Palombella (1990) writes: “The ‘regulatory crisis’ of the contemporary State is certainly the result of the principle of the omnipotence of law, which has turned into the widespread standardisation of every aspect of social reality. But the fact that the law influences events through a qualification process leads to at least two alternatives: on the one hand, the principle of juridical qualification becomes the constitutive principle of reality; on the other hand, the intervention of the juridical rule adheres to a pre-existing reality”, p. 367 (my own translation).

  24. Leenes and Lucivero (2014), p. 14.

  25. Koops (2008), p. 165. Cf. Brownsword (2005).

  26. “[L]iberty is constructed by structures that preserve a space for individual choice, however that choice may be constrained”, Lessig (1999a, b), pp. 7–8. As evidenced by Koops, recalling Brownsword’s assumptions, “[f]or human dignity, it is important not only that right choices are made (to comply with the rules) but also that wrong choices can be made, and that not all ‘bad’ things are simply made impossible, for human life is enriched by suffering”. Koops (2008), p. 165.

  27. “Technology that sets new norms clearly raises questions about the acceptability of the norms, but also if technology is used ‘only’ to enforce existing legal norms, its acceptability can be questioned, since the reduction of ‘ought’ or ‘ought not’ to ‘can’ or ‘cannot’ threatens the flexibility and human interpretation of norms that are fundamental elements of law in practice”. Ibid., pp. 157–158.

  28. “Because technology is often irreversible—once it is developed and applied in society, it is hard to fundamentally remove it from society in those applications—the process of developing technology is a key focus when normativity is at stake. After all, it may well be too late when technology simply appears in society to ask whether it is acceptable to use this technology; quite often, the genie may then be out of the bottle never to be put back in”. Ibid., p. 166.

  29. Zagrebelsky (1992).

  30. As stated by Zagrebelsky (1992): “The modern English parliament does not rely on a clear shift from the production of law through the activity of the Courts to ‘legislative’ production. Among the essential criteria of this ‘extraction’ of law from practical cases are ‘circumstances, conveniency, expediency, probability’. The progress of law did not actually depend on the increasingly refined deduction from great immutable and rational principles (the scientia iuris), but rather on induction from empirical experience, enlightened by situations (the iuris prudentia), through ‘challenge and answer, trial and error’” (pp. 27–28). See also Kluxen (1980), p. 103.

  31. Koops (2012).

  32. Reidenberg (1998), p. 584.

  33. Klabbers (2017), p. 28.

  34. This is the meaning generally attributed to Article 6 of the European Convention on Human Rights, associated with the concept of due process: the accused must be able to defend himself or herself effectively.

  35. Hildebrandt and Koops (2010), p. 438.

  36. It is worth mentioning the significant contributions of Mitchell (1995) and De Monchaux and Schuster (1997).

  37. “A ‘code policy’ is, first of all, a policy concerning intellectual property and the forms that it can take in the digital era. This leads the code to be regarded as a ‘common good’ (commons), together with the contribution provided by the opening of the code to the re-evaluation of the idea of commons”. Goldoni (2007), p. 23.

  38. Lessig (1999b), p. 511.

  39. Posner (2000).

  40. Lessig (1999a, b). This article was written in reply to Judge Frank Easterbrook who, during a lecture held at the University of Chicago, said that a law of cyberspace could not express any general heuristic resource, just like a “law of the horse” or any law specifically focused on a particular object or space. See Easterbrook (1996), pp. 207 ff.

  41. Cf. Vermaas et al. (2008) and Yeung (2008).

  42. According to Langdon Winner, the overpasses in Long Island were low because they had been intentionally designed that way: buses could not pass under them, so the lower social classes could not reach the beaches of New York. See Winner (1980).

  43. Lessig (1999b), p. 511.

  44. It has been observed that “architecture should be understood in a broad sense, i.e. as the organization of any kind of space by means of the materials available. Architecture somehow represents the ‘nature’ of a context but, unlike natural data (which can rarely be changed, and are therefore considered stable or unchanging), it can be either fully or partially changed to revise the organizational structure of the space in question”. Goldoni (2007), p. 3.

  45. Lessig (1999b), pp. 508–509.

  46. “The ‘net regulation’ of any particular policy is the sum of the regulatory effects of the four modalities together. A policy trades off among these four regulatory tools. It selects its tool depending upon what works best”. Lessig (1999b), p. 507.
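
     Lessig’s additive picture can be made concrete with a small sketch. The following fragment is purely illustrative and not drawn from Lessig’s text: the four modality names (law, norms, market, architecture) are his, but the numeric weights and the additive model are hypothetical simplifications.

```python
# Toy model of Lessig's four regulatory modalities. The additive
# weighting is a hypothetical simplification for illustration only,
# not Lessig's own formalism.
from dataclasses import dataclass

@dataclass
class Policy:
    name: str
    law: float           # constraint from legal rules
    norms: float         # constraint from social norms
    market: float        # constraint from prices and incentives
    architecture: float  # constraint from design ("code")

    def net_regulation(self) -> float:
        """The 'net regulation': the sum of the four modalities' effects."""
        return self.law + self.norms + self.market + self.architecture

# A policymaker "selects its tool depending upon what works best":
# this hypothetical smoking policy leans mostly on market and norms.
smoking = Policy("smoking", law=0.2, norms=0.3, market=0.4, architecture=0.1)
print(smoking.net_regulation())  # 1.0
```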

  47. Koops (2008), p. 159.

  48. Janic et al. (2013), p. 18. On the difference between the two kinds of tools: “[w]hile PETs ‘think’ in terms of shielding personal data, TETs ‘think’ in terms of empowering individuals by making profiling activities visible”. See Hildebrandt and Koops (2010), p. 450.

  49. “[R]ules must be embedded in such a way that they share the nuance and flexibility of the natural-language rules that determine the written law. (…) there is a democratic challenge: is value-embedded technology or the articulation of legal norms in digital technologies legitimate?”, Hildebrandt and Koops (2010), p. 452.

  50. Koops (2008), p. 172.

  51. Lessig (1999a, b), p. 546.

  52. Lessig’s assumption falls within the paradigm of the decline of legal exclusivism (or centralism) because, “although it did not give up the role of the authoritative constituting, it has put it in relation with other factors which, in various ways, affect the conduct of the partners, precisely identifying the binding character in the result of this mechanism”. Laghi (2015), p. 156.

  53. Lévy (1990), p. 68 (my own translation).

  54. “Technology is neither good nor bad; nor is it neutral”, Kranzberg (1986).

  55. “The way a technology is designed, in short, defines and influences the actions of the subject, preventing some and allowing or facilitating others. In this sense, objects of daily use have a moral content: they prescribe, oblige, allow, prohibit and regulate the behaviour of users”. See Bisol et al. (2014), p. 246.

  56. “[M]oving from one type of infrastructure to the next has major consequences for the manner in which legal authority and normativity can be sustained. For lawyers and legislators it may be too obvious to note that modern law is in fact technologically embodied, namely in the technology of the script and the printing press”, Hildebrandt (2011), p. 237.

  57. Ibid., p. 231: “If we want to sustain the rights and freedoms that developed with modern legal systems, legislators need to engage in the design of the novel computational infrastructures, taking care that they at least provide effective legal protection against their own omniscience and their capability to enforce a normativity that goes against the grain of constitutional democracy”.

  58. Hildebrandt (2010), p. 454.

  59. Leenes and Lucivero (2014), p. 209.

  60. Cf. Thaler and Sunstein (2008).

  61. Becker (1976).

  62. “Computers weren’t initially created to persuade; they were built for handling data – calculating, storing, and retrieving. But as computers have migrated from research labs onto desktops and into everyday life, they have become more persuasive by design. Today computers are taking on a variety of roles as persuaders, including roles of influence that traditionally were filled by teachers, coaches, clergy, therapists, doctors, and salespeople, among others. We have entered an era of persuasive technology, of interactive computing systems designed to change people’s attitudes and behaviors”. Fogg (2003), p. 1.

  63. Norman (1988), p. 9.

  64. Norman (2007).

  65. Akrich (1992).

  66. Latour (1992).

  67. Leenes (2011).

  68. Rossato (2006).

  69. Pagallo (2014), p. 130.

  70. Kant (1991), p. 74.

  71. “Regulation affects the behaviour of individuals and (often) restricts their autonomy and freedom to act. This requires justification. This requirement equally applies to restrictions imposed by the state and to those imposed by private entities. The nature of the justification may differ. The legitimation of state intervention is well understood: it has to be legitimate and based on the rule of law. The justification of intervention in freedoms of individuals by private entities is usually based on consent or the protection of rights”, Leenes (2011), p. 149.

  72. Lessig (1999a), p. 98.

  73. Smith (2000).

  74. This type of intelligence is widely used, for example, in self-driving cars as well as in autopilot systems for civil aviation and drones. A famous example is Deep Blue, the chess-playing computer that defeated world champion Garry Kasparov in 1997.

  75. Fabris (2012), p. 80.

  76. Ibid., p. 81.

  77. The three laws formulated by Asimov in 1942 in his short story “Runaround” are as follows:

     1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
     2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
     3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

     Cf. Asimov (1950).
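
     The strict priority ordering of the three laws (each law yields to those before it) can be made explicit in code. The sketch below is a hypothetical illustration, not a real control system: the predicates are invented stubs, and it captures only the ordering of the laws as veto checks on a candidate action, not the positive duties to act.

```python
# Hypothetical sketch of the lexical priority of Asimov's three laws.
# All predicates are invented stubs standing in for judgments that no
# real system can make so cleanly.

def harms_human(action: str) -> bool:
    return action == "push_human"         # stub for the First Law test

def disobeys_human_order(action: str) -> bool:
    return action == "ignore_order"       # stub for the Second Law test

def endangers_robot(action: str) -> bool:
    return action == "walk_into_fire"     # stub for the Third Law test

def permitted(action: str) -> bool:
    if harms_human(action):               # First Law: vetoes everything
        return False
    if disobeys_human_order(action):      # Second Law: yields only to the First
        return False
    if endangers_robot(action):           # Third Law: yields to both above
        return False
    return True

print(permitted("fetch_coffee"))  # True
print(permitted("push_human"))    # False
```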

  78. From this point of view, the installation of a “black box” in smart cars has been suggested so that any failure can be detected; moreover, the users of these cars should enter into an insurance agreement covering any damage possibly caused by “autonomous” robots.

  79. Moro (2015), p. 530 (my own translation).

  80. Kurzweil (2005, 2012).

  81. Searle (1980).

  82. These cases are referred to as “anthropomorphism” or “zoomorphism”, i.e. the tendency to assign human or animal characteristics to inanimate beings: “[h]umans may also project emotions, feelings of pleasure and pain, the capacity to form relationships with others, and the capacity to care for others and be cared for by them in turn. The projection of human or animal emotions onto inanimate objects is as old as history itself. People hear the wind howl and the ocean roar; they project agency and loyalty onto their ships and cars. The projection of humanity onto what is not human is the reflection of the self on the outside world”, Balkin (2015), p. 56.

  83. “When we criticize algorithms, we are really criticizing the programming, or the data, or their interaction. But equally important, we are also criticizing the use to which they are being put by the humans who programmed the algorithms, collected the data, or employed the algorithms and the data to perform particular tasks”, Balkin (2017), p. 14.

  84. Grion (2016).

  85. Balkin (2017), p. 8.

  86. Balkin (2016).

  87. “Confidentiality law arose centuries ago to keep certain kinds of shared information private. Multiple areas of the law provide confidentiality protections for preventing the disclosure of information in intermediate states, whether through professional duties of confidentiality, implied or expressed contracts for confidentiality, evidentiary privileges, or statutory rules. We have long had confidentiality rules like the duties lawyers owe to their clients and doctors owe to their patients to incent individuals to feel safe in sharing their confidences to advance important societal values of providing effective legal representation and medical care. We also have statutory rules that explicitly create confidential relationships regarding health, financial, and video records information. We also protect obligations of confidentiality that arise through voluntary promises or confidentiality agreements like preventing employees from revealing business secrets. Confidentiality law reveals how we have long recognized shared information can still be kept private using effective legal tools. Expanding confidentiality law approaches would seem to be one way to help keep shared information private”, Richards and King (2014), pp. 415–416.

  88. As Frank Pasquale has written, the large companies of the digital era are “black boxes” whose operations, often driven by market objectives, are inaccessible not only to users but also to analysts. Pasquale (2015).

  89. Rodotà (2012), p. 322.

  90. Reidenberg (1998), p. 583.

  91. Zittrain (2014).

  92. “[G]overnments should intervene…when private action has public consequences”, writes Lessig (1999a, b), p. 233. Mireille Hildebrandt has taken up and developed this point, stating that “genetic tests or technologically enhanced soldiers should be obligated to present their case to the public that is composed of those that will suffer or enjoy the consequences. In other words, the hybrids that are propelled into the collective must survive the scrutiny of the public that constitutes itself around what it considers to be a matter of concern”.

  93. The current Italian legal system includes a civil-law category that addresses the same need for “protection”: agreements with “protective effects” towards third parties, which separate the obligation of protection from the obligation of performance, so that the content of the obligation is not only what is written (the primary obligation of performance) but also what is right (secondary obligations of protection, whether instrumental or accessory).

  94. “[T]he algorithm affects your reputation by placing you in a category or class, which is not necessarily an assessment of risk. The algorithm constructs groups in which you are placed and through which you are known and therefore potentially acted upon. Classification can affect your reputation without an assessment of risk because it says what kind of person you are and who you are treated as equivalent to (and, implicitly, better than or worse than according to some metric)”, Balkin (2017), p. 40.

  95. “[H]uman beings and organizations can use algorithms to lead you and others like you to make (more or less) predictable choices that benefit the algorithm operator but do not enhance your welfare and may actually reduce your welfare”, Balkin (2017), p. 41.

  96. “[T]he algorithm causes you to internalize its classifications and assessments of risk, causing you to alter your behavior in order to avoid surveillance or avoid being categorized as risky”, ibid.
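
     The classificatory mechanism described in notes 94–96 can be pictured with a toy example. The sketch below is a hypothetical illustration: the features, weights, threshold, and labels are all invented, and no real scoring system is being described.

```python
# Toy illustration of classification in Balkin's sense: an invented
# score places each person in a group ("risky"/"safe"), and it is the
# group label, not any individualized judgment, that then travels
# with the person. Features, weights, and threshold are hypothetical.

def risk_class(late_payments: int, address_changes: int) -> str:
    score = 2 * late_payments + address_changes  # invented metric
    return "risky" if score >= 4 else "safe"

applicants = {"Ada": (0, 1), "Bo": (2, 1)}
for name, (late, moves) in applicants.items():
    # Everyone assigned the same label is "treated as equivalent",
    # whatever differences the score ignores.
    print(name, "->", risk_class(late, moves))
```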

  97. Balkin discourages the application of criminal law and anti-discrimination law: the use of the “respondeat superior” principle would make no sense, since the algorithm itself is not a subject to which anything can be attributed and, consequently, it is impossible to transfer a responsibility which must be assigned, from the very beginning, to the human being.

  98. Ibid., p. 34.

References

  • Akrich, Madeleine. 1992. The De-Scription of Technological Objects. In Shaping Technology/Building Society: Studies in Sociotechnical Change, ed. Wiebe Bijker and John Law, 205–224. Cambridge: MIT Press.

  • Amato Mangiameli, Agata. 2017. Tecno-regolazione e diritto. Brevi note su limiti e differenze. Diritto dell’informazione e dell’informatica: 147–167.

  • Asimov, Isaac. 1950. Runaround. In I, Robot. New York: Bantam.

  • Balkin, Jack M. 2015. The Path of Robotics Law. California Law Review Circuit 6: 45–60.

  • ———. 2016. Information Fiduciaries and the First Amendment. U.C. Davis Law Review 49: 1183–1234.

  • ———. 2017. 2016 Sidley Austin Distinguished Lecture on Big Data Law and Policy: The Three Laws of Robotics in the Age of Big Data. Ohio State Law Journal 78: 1217. Yale Law School, Public Law Research Paper No. 592.

  • Becker, Gary. 1976. The Economic Approach to Human Behavior. Chicago: University of Chicago Press.

  • Bisol, Benedetta, Antonio Carnevale, and Federica Lucivero. 2014. Diritti umani, valori e nuove tecnologie. Metodo. International Studies in Phenomenology and Philosophy 2: 235–252.

  • Brownsword, Roger. 2005. Code, Control, and Choice: Why East is East and West is West. Legal Studies 1: 1–21.

  • Butler, Samuel. 1872. Erewhon, or Over the Range. London: Trübner.

  • De Monchaux, John, and Mark J. Schuster. 1997. Five Things to Do. In Preserving the Built Heritage: Tools for Implementation, ed. John De Monchaux and Mark J. Schuster, 3–12. Waltham: University Press of New England.

  • di Robilant, Enrico. 1973. Il diritto nella società industriale. Rivista internazionale di filosofia del diritto: 225–262.

  • Easterbrook, Frank. 1996. Cyberspace and the Law of the Horse. University of Chicago Legal Forum 1996: 207–216.

  • Fabris, Adriano. 2012. Etica delle nuove tecnologie. Brescia: Editrice La Scuola.

  • Floridi, Luciano. 2014. The Fourth Revolution: How the Infosphere is Reshaping Human Reality. Oxford: Oxford University Press.

  • Fogg, Brian J. 2003. Persuasive Technology: Using Computers to Change What We Think and Do. Amsterdam: Morgan Kaufmann Publishers.

  • Foucault, Michel. 1988. Technologies of the Self. In Technologies of the Self: A Seminar with Michel Foucault, ed. L. Martin, H. Gutman, and P. Hutton, 16–49. Amherst: The University of Massachusetts Press.

  • Goldoni, Marco. 2007. Politiche del codice. Architettura e diritto nella teoria di Lessig. Available online: https://archiviomarini.sp.unipi.it/350/1/lessig.pdf

  • Grion, Luca. 2016. Ethics in the Age of Robotics Revolution. Cosmopolis, Rivista di Filosofia e Teoria Politica 2.

  • Hildebrandt, Mireille. 2008. Legal and Technological Normativity: More (and Less) Than Twin Sisters. Techné: Journal of the Society for Philosophy and Technology 3: 169–183. https://works.bepress.com/mireille_hildebrandt/13/.

  • ———. 2010. The Challenges of Ambient Law. Modern Law Review 73: 428–460.

  • ———. 2011. Legal Protection by Design: Objections and Refutations. Legisprudence 5: 223–248.

  • Hildebrandt, Mireille, and Bert-Jaap Koops. 2010. The Challenges of Ambient Law and Legal Protection in the Profiling Era. Modern Law Review 73: 428–460.

  • Irti, Natalino. 2007. Il diritto nell’età della tecnica. Napoli: Editoriale Scientifica.

  • Janic, Milena, Pieter Wijbenga, and Thijs Veugen. 2013. Transparency Enhancing Tools (TETs): An Overview. Washington, D.C.: IEEE Computer Society.

  • Kant, Immanuel. 1991. On the Common Saying: ‘This May Be True in Theory, but It Does Not Apply in Practice’. In Kant’s Political Writings, ed. Hans Reiss, trans. H.B. Nisbet. Cambridge: Cambridge University Press.

  • Khanna, Ayesha, and Parag Khanna. 2012. Hybrid Reality: Thriving in the Emerging Human-Technology Civilization. TED Books.

  • Klabbers, Jan. 2017. Doing Justice? Bureaucracy, the Rule of Law and Virtue Ethics. Journal of Legal Philosophy 1: 27–49.

  • Kluxen, Kurt. 1980. Die geistesgeschichtlichen Grundlagen des englischen Parlamentarismus. In Parlamentarismus, ed. Kurt Kluxen, 99–111. Berlin: Athenäum.

  • Koops, Bert-Jaap. 2008. Criteria for Normative Technology: The Acceptability of ‘Code as Law’ in Light of Democratic and Constitutional Values. In Regulating Technologies: Legal Futures, Regulatory Frames and Technological Fixes, ed. Roger Brownsword and Karen Yeung, 157–174. Oxford: Hart Publishing.

  • ———. 2012. The (In)Flexibility of Techno-Regulation and the Case of Purpose-Binding. Legisprudence 5: 171–194.

  • Kranzberg, Melvin. 1986. Technology and History: ‘Kranzberg’s Laws’. Technology and Culture 27: 544–560.

  • Krygier, Martin. 2009. The Rule of Law: Legality, Teleology, Sociology. In Relocating the Rule of Law, ed. Gianluigi Palombella and Neil Walker, 45–70. Oxford: Hart Publishing.

  • Kurzweil, Ray. 2005. The Singularity Is Near: When Humans Transcend Biology. New York: Penguin.

  • ———. 2012. How to Create a Mind: The Secret of Human Thought Revealed. New York: Penguin.

  • Laghi, Pasquale. 2015. Cyberspazio e sussidiarietà. Napoli: Edizioni Scientifiche Italiane.

  • Latour, Bruno. 1992. Where Are the Missing Masses? The Sociology of a Few Mundane Artifacts. In Shaping Technology/Building Society: Studies in Sociotechnical Change, ed. Wiebe Bijker and John Law, 225–258. Cambridge: MIT Press.

  • Leenes, Ronald. 2011. Framing Techno-Regulation: An Exploration of State and Non-state Regulation by Technology. Legisprudence 5: 143–169.

  • Leenes, Ronald, and Federica Lucivero. 2014. Laws on Robots, Laws by Robots, Laws in Robots: Regulating Robot Behaviour by Design. Law, Innovation and Technology 6: 194–222.

  • Lessig, Lawrence. 1999a. Code and Other Laws of Cyberspace. New York: Basic Books.

  • ———. 1999b. The Law of the Horse: What Cyberlaw Might Teach. Harvard Law Review 113: 501–549.

  • Lévy, Pierre. 1990. Le tecnologie dell’intelligenza. Trans. F. Berardi. Milano: Synergon.

  • Mitchell, William J. 1995. City of Bits: Space, Place, and the Infobahn. Cambridge: MIT Press.

  • Moro, Paolo. 2015. Libertà del robot? Sull’etica delle macchine intelligenti. In Filosofia del diritto e nuove tecnologie. Prospettive di ricerca tra teoria e pratica, ed. Raffaella Brighi and Silvia Zullo, 525–544. Roma: Aracne.

  • Norman, Donald A. 1988. The Design of Everyday Things. New York: Basic Books.

  • ———. 2007. The Design of Future Things. New York: Basic Books.

  • Pagallo, Ugo. 2014. Il diritto nell’età dell’informazione. Il riposizionamento tecnologico degli ordinamenti giuridici tra complessità sociale, lotta per il potere e tutela dei diritti. Torino: Giappichelli.

  • Palombella, Gianluigi. 1990. L’istituzione del diritto. Una prospettiva di ricerca. Materiali per una storia della cultura giuridica 2: 367–401.

  • ———. 2009. The Rule of Law Beyond the State: Failures, Promises, and Theory. International Journal of Constitutional Law 7: 442–467.

  • Pascuzzi, Giovanni, ed. 2016. Il diritto dell’era digitale. Bologna: il Mulino.

  • Pasquale, Frank. 2015. The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge: Harvard University Press.

  • Posner, Eric. 2000. Law and Social Norms. Cambridge: Harvard University Press.

  • Radbruch, Gustav. 1950. Introduzione alla scienza giuridica. Torino: Giappichelli.

  • Rawls, John. 1955. Two Concepts of Rules. The Philosophical Review 64: 3–32.

  • Reidenberg, Joel R. 1998. Lex Informatica: The Formulation of Information Policy Rules Through Technology. Texas Law Review 76: 553–584.

  • Richards, Neil M., and Jonathan H. King. 2014. Big Data Ethics. Wake Forest Law Review 49: 393–432.

  • Rodotà, Stefano. 2012. Il diritto di avere diritti. Roma-Bari: Laterza.

  • Rossato, Andrea. 2006. Diritto e architettura nello spazio digitale. Il ruolo del software libero. Padova: Cedam.

  • Salazar, Carmela. 2014. Umano troppo umano…o no? Robot, androidi e cyborg nel “mondo del diritto” (prime notazioni). BioLaw Journal 1: 255–276.

  • Searle, John R. 1964. How to Derive “Ought” from “Is”. The Philosophical Review 73: 43–58.

  • ———. 1980. Minds, Brains, and Programs. Behavioral and Brain Sciences 3: 417–457.

  • Smith, David J. 2000. Changing Situations and Changing People. In Ethical and Social Perspectives on Situational Crime Prevention, ed. Andrew von Hirsch, David W. Garland, and Alison Wakefield, 147–173. Oxford: Hart Publishing.

  • Thaler, Richard H., and Cass Sunstein. 2008. Nudge: Improving Decisions About Health, Wealth and Happiness. New Haven: Yale University Press.

  • Vermaas, Pieter E., Peter Kroes, Andrew Light, and Steven A. Moore, eds. 2008. Philosophy and Design: From Engineering to Architecture. Dordrecht: Springer.

  • Warren, Samuel D., and Louis D. Brandeis. 1890. The Right to Privacy. Harvard Law Review 4 (5): 193–220.

  • Weiser, Mark. 1993. Some Computer Science Issues in Ubiquitous Computing. Communications of the ACM 36 (7); reprinted as Ubiquitous Computing, Nikkei Electronics, 199: 137–143.

  • Westin, Alan F. 1967. Privacy and Freedom. New York: Atheneum.

  • ———. 1982. Home Information Systems: The Privacy Debate. Datamation 28: 100–113.

  • Winner, Langdon. 1980. Do Artifacts Have Politics? Daedalus 109: 121–136.

  • Yeung, Karen. 2008. Towards an Understanding of Regulation by Design. In Regulating Technologies: Legal Futures, Regulatory Frames and Technological Fixes, ed. Roger Brownsword and Karen Yeung, 79–108. Oxford: Hart Publishing.

  • Zagrebelsky, Gustavo. 1992. Il diritto mite. Torino: Einaudi.

  • Zittrain, Jonathan. 2014. Response: Engineering an Election. Digital Gerrymandering Poses a Threat to Democracy. Harvard Law Review Forum 127: 335–336.

Author information

Correspondence to Francesco De Vanna.


Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter


Cite this chapter

De Vanna, F. (2019). The Construction of a Normative Framework for Technology-Driven Innovations: A Legal Theory Perspective. In: Carpanelli, E., Lazzerini, N. (eds) Use and Misuse of New Technologies. Springer, Cham. https://doi.org/10.1007/978-3-030-05648-3_10


  • DOI: https://doi.org/10.1007/978-3-030-05648-3_10


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-05647-6

  • Online ISBN: 978-3-030-05648-3

  • eBook Packages: Law and Criminology, Law and Criminology (R0)
