
GDPR and the Concept of Risk: The Role of Risk, the Scope of Risk and the Technology Involved
  • Katerina Demetzou
Chapter
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 547)

Abstract

The prominent position of risk in the GDPR has raised questions as to the meaning this concept should be given in the field of data protection. This article acknowledges the value of extracting information from the GDPR and using it as a means of interpreting risk. Both the ‘role’ that risk holds in the GDPR and the ‘scope’ given to the concept are examined, providing the reader with valuable insight into the legislature’s intentions with regard to the concept of risk. The article also underlines the importance of taking into account the new technologies used in personal data processing operations. Technologies such as the IoT, AI and algorithms present characteristics (e.g. complexity, autonomous behaviour, and the processing and generation of vast amounts of personal data) that influence our understanding of risk in data protection in various ways.

Keywords

Risk · Concept · Data protection · Accountability · Compliance · Role · Scope · Fundamental rights · New technologies


Copyright information

© IFIP International Federation for Information Processing 2019

Authors and Affiliations

Business and Law Research Centre (OO&R), Radboud University, Nijmegen, Netherlands
