Philosophy & Technology

Volume 31, Issue 4, pp. 507–524

Algo-Rhythms and the Beat of the Legal Drum

  • Ugo Pagallo
Research Article


Abstract

The paper focuses on the concerns and legal challenges raised by the use of algorithms. A particular class of algorithms that augment or replace analysis and decision-making by humans, i.e. data analytics and machine learning, is under scrutiny. Taking into account Balkin’s work on “the laws of an algorithmic society”, attention is drawn to obligations of transparency, matters of due process, and accountability. This US-centric analysis of the drawbacks and loopholes of current legal systems is complemented with an analysis of the norms and principles of EU data protection law, the “GDPR”. The aim is twofold. On the one hand, the intent is to shed light on some crucial differences between US and EU law on the regulation of algorithmic operators, both public and private. Whereas in the USA scholars debate whether, and to what extent, new duties and responsibilities of algorithmic operators, e.g. as information fiduciaries, should amend the current framework of self-regulation and light government (as shown by the White House’s Office of Science and Technology Policy report from November 2016), in EU law many of these new duties and responsibilities have been imposed on algorithmic operators as data controllers. On the other hand, whether such approaches will successfully tackle the normative challenges of the algorithmic society remains an open issue, one likely to be the main topic of debate in the coming years. Disagreement may concern: (i) the terms framing the legal question, e.g. the statistical purposes of data processing; (ii) how such terms are related to each other in legal reasoning, e.g. whether a right to explanation is valid law in the EU; and (iii) legal hard cases that will increasingly turn on principles that are at stake in, but not only in, data protection, e.g. informational self-determination.
By entrusting such legal hard cases to algorithms, or to some sort of smart artificial agent, humans still bear full responsibility for judging what is socially, ethically, and legally “plain” or “hard” in social affairs. The balance between delegating decisions to algorithms and withholding that delegation will be the leitmotiv of the algorithmic society. Since the devil is in the detail, the current paper is devoted to some of those details.


Keywords: Algorithm · Algorithmic society · Data protection law · Legal hard case · Right to explanation · Technological convergence


References

  1. Balkin, J. M. (2016). The three laws of robotics in the age of big data, October, at (last accessed 4 February 2017).
  2. Cath, C., Wachter, S., Mittelstadt, B., Taddeo, M., & Floridi, L. (2016). Artificial intelligence and the ‘good society’: the US, EU, and UK approach, December, at (last accessed 4 February 2017).
  3. Christian, B., & Griffiths, T. (2016). Algorithms to live by: the computer science of human decisions. New York: Holt.
  4. Durante, M. (2015). The democratic governance of information societies. A critique to the theory of stakeholders. Philosophy and Technology, 28(1), 11–32.
  5. Dworkin, R. (1985). A matter of principle. Oxford: Oxford University Press.
  6. Floridi, L. (2012). Big data and their epistemological challenge. Philosophy & Technology, 25(4), 435–437.
  7. Floridi, L. (2014). The ethics of information. Oxford: Oxford University Press.
  8. Goodman, B., & Flaxman, S. (2016). EU regulations on algorithmic decision-making and a “right to explanation”, at (last accessed 4 February 2017).
  9. Grindrod, P. (2014). Mathematical underpinnings of analytics: theory and applications. Oxford: Oxford University Press.
  10. Mittelstadt, B., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: mapping the debate. Big Data & Society, July–December, 1–21.
  11. Hart, H. L. A. (1961). The concept of law. Oxford: Clarendon.
  12. Hildebrandt, M. (2013). Legal protection by design in the smart grid. Report commissioned by the Smart Energy Collective (SEC). Nijmegen: Radboud University Nijmegen.
  13. Hildebrandt, M. (2016). The new imbroglio: living with machine algorithms. The Art of Ethics in the Information Society, at (last accessed 24 February 2017).
  14. Kahneman, D. (2011). Thinking, fast and slow. New York: Macmillan.
  15. Klimas, T., & Vaiciukaite, J. (2008). The law of recitals in European Community legislation. ILSA Journal of International & Comparative Law, 15, at (last accessed 4 February 2017).
  16. Koops, B.-J., & Leenes, R. (2014). Privacy regulation cannot be hardcoded: a critical comment on the “privacy by design” provision in data protection law. International Review of Law, Computers & Technology, 28, 159.
  17. Kroll, J. A., Huey, J., Barocas, S., Felten, E., Reidenberg, J. R., Robinson, D. G., & Yu, H. (forthcoming 2017). Accountable algorithms. University of Pennsylvania Law Review, 165.
  18. O’Neil, C. (2016). Weapons of math destruction: how big data increases inequality and threatens democracy. New York: Random House.
  19. OSTP (2016). “The National Artificial Intelligence Research and Development Strategic Plan”. National Science and Technology Council, Networking and Information Technology Research and Development Subcommittee, Washington, D.C.
  20. Pagallo, U. (2011). ISPs & rowdy web sites before the law: should we change today’s safe harbour clauses? Philosophy and Technology, 24(4), 419–436.
  21. Pagallo, U. (2016a). Even angels need the rules: on AI, roboethics, and the law. In G. A. Kaminka et al. (Eds.), ECAI proceedings (pp. 209–215). Amsterdam: IOS Press.
  22. Pagallo, U. (2016b). Three lessons learned for intelligent transport systems that abide by the law. JusLetter IT, at (last accessed 21 January 2017).
  23. Pagallo, U. (2017a). The group, the private, and the individual: a new level of data protection? In L. Taylor, L. Floridi, & B. van der Sloot (Eds.), Group privacy: new challenges of data technologies (pp. 159–173). Dordrecht: Springer.
  24. Pagallo, U. (2017b). When morals ain’t enough: robots, ethics, and the rules of the law. Minds and Machines, January.
  25. Pagallo, U. (2017c). The legal challenges of big data: putting secondary rules first in the field of EU data protection. European Data Protection Law Review, 3(1), 36–46.
  26. Pagallo, U., & Durante, M. (2009). Three roads to P2P systems and their impact on business ethics. Journal of Business Ethics, 90(4), 551–564.
  27. Pagallo, U., & Durante, M. (2016a). The philosophy of law in an information society. In L. Floridi (Ed.), The Routledge handbook of philosophy of information (pp. 396–407). Oxon & New York: Routledge.
  28. Pagallo, U., & Durante, M. (2016b). The pros and cons of legal automation and its governance. European Journal of Risk Regulation, 7(2), 323–334.
  29. Pasquale, F. (2015). The black box society: the secret algorithms that control money and information. Cambridge, MA: Harvard University Press.
  30. Dwork, C., & Roth, A. (2014). The algorithmic foundations of differential privacy. Foundations and Trends in Theoretical Computer Science, 9(3–4), 211–407.
  31. Mayer-Schönberger, V., & Padova, Y. (2016). Regime change? Enabling big data through Europe’s new data protection regulation. Columbia Science and Technology Law Review, 17, 315–335.
  32. Taddeo, M., & Floridi, L. (2016). The debate on the moral responsibilities of online service providers. Science and Engineering Ethics, 22(6), 1575–1603.
  33. Van Otterlo, M. (2013). A machine learning view on profiling. In M. Hildebrandt & K. de Vries (Eds.), Privacy, due process and the computational turn: philosophers of law meet philosophers of technology (pp. 41–64). Abingdon: Routledge.
  34. Vladeck, D. (2015). Separated by common goals: a U.S. perspective on narrowing the U.S.-EU privacy divide. In A. R. Lombarte & R. G. Mahamut (Eds.), Hacia un nuevo derecho europeo de protección de datos (pp. 207–243). Valencia: Tirant lo Blanch.
  35. Wachter, S., Mittelstadt, B., & Floridi, L. (2016). Why a right to explanation of automated decision-making does not exist in the General Data Protection Regulation, December, at (last accessed 4 February 2017).
  36. Yeung, K. (2007). Towards an understanding of regulation by design. In R. Brownsword & K. Yeung (Eds.), Regulating technologies: legal futures, regulatory frames and technological fixes (pp. 79–108). London: Hart.
  37. Zittrain, J. (2014). Facebook could decide an election without anyone ever finding out. New Republic, June 1.

Copyright information

© Springer Science+Business Media B.V. 2017

Authors and Affiliations

  1. Law School, University of Torino, Torino, Italy
