Privacy Risk Assessment: From Art to Science, by Metrics

  • Isabel Wagner
  • Eerke Boiten
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11025)


Privacy risk assessments aim to analyze and quantify the privacy risks associated with new systems. As such, they are critically important in ensuring that adequate privacy protections are built in. However, current methods to quantify privacy risk rely heavily on experienced analysts picking the “correct” risk level on e.g. a five-point scale. In this paper, we argue that a more scientific quantification of privacy risk increases accuracy and reliability and can thus make it easier to build privacy-friendly systems. We discuss how the impact and likelihood of privacy violations can be decomposed and quantified, and stress the importance of meaningful metrics and units of measurement. We suggest a method of quantifying and representing privacy risk that considers a collection of factors as well as a variety of contexts and attacker models. We conclude by identifying some of the major research questions to take this approach further in a variety of application scenarios.
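The decomposition of privacy risk into quantified likelihood and impact, evaluated across multiple contexts and attacker models, can be sketched in code. The following is a minimal illustrative sketch only: the scenario factors, the [0, 1] normalisation, and the worst-case aggregation are hypothetical choices made for this example, not the method proposed in the paper.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """One (context, attacker) combination with quantified risk factors.

    The fields and their scales are illustrative assumptions:
    likelihood is a probability of a privacy violation in [0, 1],
    and impact is harm normalised to [0, 1].
    """
    context: str        # e.g. "location data shared with advertisers"
    attacker: str       # e.g. "honest-but-curious service provider"
    likelihood: float
    impact: float

    def risk(self) -> float:
        """Expected harm for this scenario (likelihood x impact)."""
        return self.likelihood * self.impact

def overall_risk(scenarios: list[Scenario]) -> float:
    """Aggregate by taking the maximum expected harm across scenarios.

    This is a conservative (worst-case) choice; a sum or a weighted
    mean would encode different assumptions about how harms combine.
    """
    return max(s.risk() for s in scenarios)

scenarios = [
    Scenario("ad tracking", "third-party aggregator", likelihood=0.8, impact=0.3),
    Scenario("data breach", "external attacker", likelihood=0.1, impact=0.9),
]
print(round(overall_risk(scenarios), 2))  # -> 0.24
```

Note how the two scenarios rank differently depending on the aggregation: the high-likelihood/low-impact tracking scenario dominates under expected harm, which is exactly the kind of modelling choice a more scientific quantification would make explicit rather than leave to an analyst's intuition.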


Keywords: Privacy risk metrics · Privacy impact assessment



This work was supported by the UK Engineering and Physical Sciences Research Council (EPSRC) grant EP/P006752/1. We thank Lee Hadlington, Richard Snape, and the expert participants of our workshop on “Privacy risk: harm, impact, assessment, metrics” in January 2018 for their thoughts and discussions.



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Cyber Technology Institute, De Montfort University, Leicester, UK
