RESPECT4U – Privacy as Innovation Opportunity

  • Marc van Lieshout
  • Sophie Emmert
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11079)


The right to privacy is enshrined in the European Charter of Fundamental Rights. The right to data protection is a relatively novel right, enshrined in the same Charter. While these rights seem to embody a defensive and protective approach, they also admit a positive and constructive interpretation. The GDPR may act as a driver for innovation: not only by assuring a better way of dealing with personal data, but also by fostering a more encompassing approach to assuring privacy. RESPECT4U offers a framework of seven privacy principles that help organisations promote this positive attitude towards reconciling privacy and innovation: Responsible processing, Empowering data subjects, Secure data handling, Pro-active risk management, Ethical awareness, Cost-benefit assessment, and Transparent data processing. This paper introduces the background of RESPECT4U and elaborates the seven principles that form its foundation. Together they demonstrate that privacy can act as a driver of innovation.


Privacy · Data protection · Innovation · Privacy as innovation driver · Privacy principles · GDPR · Responsible data processing · Empowerment · Transparency



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. TNO, The Hague, The Netherlands
