Towards a Roadmap for Privacy Technologies and the General Data Protection Regulation: A Transatlantic Initiative

  • Stefan Schiffner
  • Bettina Berendt
  • Triin Siil
  • Martin Degeling
  • Robert Riemann
  • Florian Schaub
  • Kim Wuyts
  • Massimo Attoresi
  • Seda Gürses
  • Achim Klabunde
  • Jules Polonetsky
  • Norman Sadeh
  • Gabriela Zanfir-Fortuna
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11079)

Abstract

The EU’s General Data Protection Regulation (GDPR) is poised to present major challenges in bridging the gap between law and technology. This paper reports on a workshop on the deployment, content and design of the GDPR that brought together academics, practitioners, civil-society actors, and regulators from the EU and the US. The discussions aimed to advance current knowledge on how abstract legal terms apply in the context of concrete technologies, and on best practices that follow the state of the art. Five themes were discussed: state of the art, consent, de-identification, transparency, and development and deployment practices. Four transversal conflicts were identified, and research recommendations were outlined to reconcile these conflicts.
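To make the de-identification theme concrete, below is a minimal illustrative sketch, not taken from the paper: keyed pseudonymisation of a direct identifier, with all names and data hypothetical. Under the GDPR (Art. 4(5) and Recital 26), data pseudonymised this way remains personal data, because the key holder can re-link pseudonyms to individuals.

import hashlib
import hmac
import secrets

# The key must be held separately from the pseudonymised dataset;
# whoever holds it can re-link pseudonyms to data subjects.
key = secrets.token_bytes(32)

def pseudonymise(identifier: str, key: bytes) -> str:
    """Map a direct identifier (e.g. an email address) to a stable pseudonym."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "alice@example.com", "zip": "79098", "birth_year": 1980}
record["email"] = pseudonymise(record["email"], key)
print(record)  # quasi-identifiers (zip, birth year) remain in the clear

The remaining quasi-identifiers illustrate why pseudonymisation alone is not anonymisation: linkage attacks on such attributes are a well-known re-identification risk, central to the de-identification debate the paper reports on.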

Acknowledgments

This work was partially funded by the European Union’s Horizon 2020 programme under grant agreements no. 740829 and no. 778615; the Luxembourg National Research Fund, project PETIT, grant agreement no. 10486741; the National Science Foundation under grant agreements CNS-1330596 and SBE-1513957; DARPA and the Air Force Research Laboratory under the Brandeis privacy initiative, grant agreement no. FA8750-15-2-0277; the Research Foundation Flanders; the Research Fund KU Leuven; and the KUL-PRiSE research project.

The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official positions, policies or endorsements, either expressed or implied, of the institutions they are affiliated with, the EDPS, the National Science Foundation, DARPA, the Air Force Research Laboratory or the US Government.

We thank Ian Oliver, Jef Ausloos, and the APF reviewers for their valuable input and comments, and all workshop participants for their contributions and stimulating discussions.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Stefan Schiffner (1)
  • Bettina Berendt (2)
  • Triin Siil (3)
  • Martin Degeling (4)
  • Robert Riemann (5)
  • Florian Schaub (6)
  • Kim Wuyts (2)
  • Massimo Attoresi (5)
  • Seda Gürses (2)
  • Achim Klabunde (5)
  • Jules Polonetsky (7)
  • Norman Sadeh (8)
  • Gabriela Zanfir-Fortuna (7)

  1. University of Luxembourg, Esch-sur-Alzette, Luxembourg
  2. KU Leuven, Leuven, Belgium
  3. Cybernetica, Tallinn, Estonia
  4. Ruhr-Universität Bochum, Bochum, Germany
  5. EDPS, Brussels, Belgium
  6. University of Michigan, Ann Arbor, USA
  7. Future of Privacy Forum, Washington, USA
  8. Carnegie Mellon University, Pittsburgh, USA
