
A Refinement Approach for the Reuse of Privacy Risk Analysis Results

  • Conference paper
  • In: Privacy Technologies and Policy (APF 2017)
  • Part of the book series: Lecture Notes in Computer Science, vol. 10518

Abstract

The objective of this paper is to improve the cost-effectiveness of privacy impact assessments through (1) a more systematic approach, (2) a better integration with privacy by design and (3) enhanced reusability. We present a three-tier process including a generic privacy risk analysis depending on the specifications of the system and two refinements based on the architecture and the deployment context, respectively. We illustrate our approach with the design of a biometric access control system.


Notes

  1. Incentives should be taken in a general sense here, including lack of awareness in the case of unintentional breach.

  2. And are independent of the architecture and the context.

  3. The user’s identity \(ID_i\) is used to fetch his enrolled template \(br_{i}\).

  4. For example, different locations correspond to different applicable laws (the motivation of a risk source may vary depending on the existence of data protection regulations and how strongly they are enforced), the strength (e.g., technical resources) or motivation of the local state to interfere [33], etc.

  5. This assumption should be valid at least for large scale attacks. However, one could argue that casinos may possess background information about certain frequent customers. Similarly, the state would be considered as having potentially a lot of background information, but it is a more relevant risk source for surveillance than for identity theft. In any case, the assumptions made in this paper are for illustrative purposes only: different assumptions about background information could be made within the same framework.

  6. In order to err on the safe side in terms of privacy protection, we consider dependent nodes such that one node may imply the other nodes.

  7. In order to err on the safe side in terms of privacy protection, we consider dependent nodes such that each node may exclude the other nodes.

  8. Keys are assumed to be protected by techniques which are not discussed here (e.g., obfuscation).

  9. Data elements that are stored persistently in a component are marked in red in Figs. 3, 8 and 10.

  10. Data elements that are stored transiently in a component are marked in blue in Figs. 3, 8 and 10.

References

  1. Article 29 Data Protection Working Party: Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679 (2017)

  2. BBC Technology: Millions of Fingerprints Stolen in US Government Hack (2015)

  3. Bringer, J., Chabanne, H., Le Métayer, D., Lescuyer, R.: Privacy by design in practice: reasoning about privacy properties of biometric system architectures. In: Bjørner, N., de Boer, F. (eds.) FM 2015. LNCS, vol. 9109, pp. 90–107. Springer, Cham (2015). doi:10.1007/978-3-319-19249-9_7

  4. Cavoukian, A.: Privacy by Design: The 7 Foundational Principles - Implementation and Mapping of Fair Information Practices. Office of the Information and Privacy Commissioner, Ontario, Canada (2010)

  5. Cavoukian, A., Chibba, M., Stoianov, A.: Advances in biometric encryption: taking privacy by design from academic research to deployment. Rev. Policy Res. 29(1), 37–61 (2012)

  6. Cavoukian, A., Stoianov, A.: Privacy by Design Solutions for Biometric One-to-Many Identification Systems (2014)

  7. Colesky, M., Hoepman, J., Hillen, C.: A critical analysis of privacy design strategies. In: 2016 IEEE Security and Privacy Workshops, SP Workshops 2016, San Jose, CA, USA, 22–26 May 2016, pp. 33–40 (2016)

  8. Commission Nationale de l’Informatique et des Libertés (CNIL): Methodology for Privacy Risk Management - How to Implement the Data Protection Act (2012)

  9. Commission Nationale de l’Informatique et des Libertés (CNIL): Privacy Impact Assessment (PIA) Methodology (How to Carry Out a PIA) (2015)

  10. Commission Nationale de l’Informatique et des Libertés (CNIL): Privacy Impact Assessment (PIA) Tools (templates and knowledge bases) (2015)

  11. Dantcheva, A., Elia, P., Ross, A.: What Else Does Your Biometric Data Reveal? A Survey on Soft Biometrics (2015)

  12. De, S.J., Le Métayer, D.: PRIAM: a privacy risk analysis methodology. In: Livraga, G., Torra, V., Aldini, A., Martinelli, F., Suri, N. (eds.) DPM/QASA 2016. LNCS, vol. 9963, pp. 221–229. Springer, Cham (2016). doi:10.1007/978-3-319-47072-6_15

  13. De, S.J., Le Métayer, D.: Privacy harm analysis: a case study on smart grids. In: International Workshop on Privacy Engineering (IWPE). IEEE (2016)

  14. De, S.J., Le Métayer, D.: Privacy Risk Analysis. Synthesis Series, Morgan & Claypool Publishers (2016)

  15. De, S.J., Le Métayer, D.: A Risk-based Approach to Privacy by Design (Extended Version). INRIA Research Report (RR-9001), December 2016

  16. De, S.J., Le Métayer, D.: PRIAM: A Privacy Risk Analysis Methodology. INRIA Research Report (RR-8876), July 2016

  17. Deng, M., Wuyts, K., Scandariato, R., Preneel, B., Joosen, W.: A privacy threat analysis framework: supporting the elicitation and fulfilment of privacy requirements. Requirements Eng. 16(1), 3–32 (2011)

  18. European Commission: Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), April 2016

  19. Expert Group 2 of the Smart Grid Task Force: Data Protection Impact Assessment Template for Smart Grid and Smart Metering Systems (2014)

  20. Frakes, W.B., Kang, K.: Software reuse research: status and future. IEEE Trans. Softw. Eng. 31(7), 529–536 (2005)

  21. Friginal, J., Guiochet, J., Killijian, M.O.: A privacy risk assessment methodology for location-based systems. http://homepages.laas.fr/guiochet/telecharge/MOBIQUITOUS2013.pdf. Accessed 13 July 2016

  22. Garcia, M., Lefkovitz, N., Lightman, S.: Privacy Risk Management for Federal Information Systems (NISTIR 8062 (Draft)). National Institute of Standards and Technology (2015)

  23. Gartland, C.: Biometrics Are a Grave Threat to Privacy. The New York Times (2016)

  24. Gürses, S., Troncoso, C., Diaz, C.: Engineering privacy by design. Comput. Priv. Data Prot. 14(3) (2011)

  25. Gürses, S., Troncoso, C., Diaz, C.: Engineering Privacy by Design Reloaded. Amsterdam Privacy Conference (2015)

  26. Hoepman, J.-H.: Privacy design strategies. In: Cuppens-Boulahia, N., Cuppens, F., Jajodia, S., Abou El Kalam, A., Sans, T. (eds.) SEC 2014. IFIP AICT, vol. 428, pp. 446–459. Springer, Heidelberg (2014). doi:10.1007/978-3-642-55415-5_38

  27. Kobie, N.: Surveillance State: Fingerprinting Pupils Raises Safety and Privacy Concerns. The Guardian (2016)

  28. McIlroy, M.D.: Mass produced software components (1969)

  29. Miglani, S., Kumar, M.: India’s Billion-member Biometric Database Raises Privacy Fears. Reuters (2016)

  30. Mili, A., Chmiel, S.F., Gottumukkala, R., Zhang, L.: An integrated cost model for software reuse. In: Proceedings of the 2000 International Conference on Software Engineering, pp. 157–166. IEEE (2000)

  31. Oetzel, M.C., Spiekermann, S.: A systematic methodology for privacy impact assessments: a design science approach. Eur. J. Inform. Syst. 23(2), 126–150 (2014)

  32. Oetzel, M.C., Spiekermann, S., Grüning, I., Kelter, H., Mull, S.: Privacy Impact Assessment Guideline for RFID Applications (2011)

  33. Oppenheim, C.: Big Brother Spying is Reaching Scary Levels (2013). http://edition.cnn.com/2013/12/10/opinion/oppenheim-privacy-reform/

  34. Pearson, S., Benameur, A.: A decision support system for design for privacy. In: Fischer-Hübner, S., Duquenoy, P., Hansen, M., Leenes, R., Zhang, G. (eds.) Privacy and Identity 2010. IFIP AICT, vol. 352, pp. 283–296. Springer, Heidelberg (2011). doi:10.1007/978-3-642-20769-3_23

  35. Prabhakar, S., Pankanti, S., Jain, A.K.: Biometric recognition: security and privacy concerns. IEEE Secur. Priv. 1(2), 33–42 (2003)

  36. Núñez del Prado Cortez, M., Friginal, J.: Geo-location inference attacks: from modelling to privacy risk assessment. In: Tenth European Dependable Computing Conference (EDCC), pp. 222–225. IEEE (2014)

  37. Prieto-Díaz, R.: Status report: software reusability. IEEE Softw. 10(3), 61–66 (1993)

  38. Spiekermann, S., Cranor, L.F.: Engineering privacy. IEEE Trans. Softw. Eng. 35(1), 67–82 (2009)

  39. Standish, T.A.: An essay on software reuse. IEEE Trans. Softw. Eng. 10(5), 494–497 (1984)

  40. Tillman, G.: Opinion: Stolen Fingers: The Case Against Biometric Identity Theft Protection. Computerworld (2009)

  41. Woodward, J.D.: Biometrics: privacy’s foe or privacy’s friend? Proc. IEEE 85(9), 1480–1492 (1997)

  42. Wright, D., De Hert, P.: Privacy Impact Assessment. Springer, Netherlands (2012)


Acknowledgments

This work has been partially funded by the French ANR-12-INSE-0013 project BIOPRIV and Inria Project Lab CAPPRIS.

Author information

Corresponding author: Sourya Joyee De.

Appendices

A Description of Phase 2 for Arch.1 and Arch.3

A.1 Arch.1: Use of an Encrypted Database

Description of Arch.1. In the simple biometric access control architecture pictured in Fig. 8, the server S stores the database of encrypted reference templates \(\overline{ebr}\) and the access control rules ac. When the user presents his identity \(ID_i\) and fresh biometric \(rd_i\) to the terminal T, T fetches the encrypted reference template \(\overline{ebr_{i}}\) from S, decrypts it using the key \(k_{br}\) and compares \(br_i\) with \(bs_i\) produced from \(rd_i\) by T (taking into account thr). The access control decision \(dec_i\) is used to allow or deny access. The access logs \(\overline{at}\) of different users are encrypted into \(\overline{eat}\) and sent back by the terminal T at regular intervals to be stored in the server S. The access log \(\overline{at}\) is updated after each access control.

The keys (Note 8) \(k_{at}\) and \(k_{br}\), the threshold thr and the access control rules ac are persistently stored in the terminal T (Note 9). In contrast, \(\overline{at}\) is stored in T only for short time intervals. \(dec_i\), \(rd_i\), \(bs_i\), \(\overline{br_{i}}\), \(ts_i\), \(\overline{at}\), \(\overline{eat}\), \(\overline{ebr_{i}}\) and \(ID_i\) are deleted from the terminal T as soon as their use is over (Note 10).

The components in this architecture are therefore: the terminal T (C.1) and the server S (C.2).
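
To make the data flow above concrete, here is a minimal sketch of Arch.1 in Python. All names (Server, Terminal, derive_template, match) and the use of Fernet as the cipher are illustrative assumptions, not part of the paper; only the movement of \(ebr_i\), \(br_i\), \(bs_i\), \(dec_i\) and \(\overline{at}\) follows the description.

```python
# Hypothetical sketch of the Arch.1 flow; names and the Fernet cipher are
# illustrative stand-ins for the paper's (unspecified) implementation.
from cryptography.fernet import Fernet

def derive_template(rd: bytes) -> bytes:
    """Placeholder feature extraction; a real system runs biometric processing."""
    return rd

def match(bs: bytes, br: bytes, thr: float) -> bool:
    """Placeholder matcher: fraction of equal bytes must reach the threshold thr."""
    same = sum(a == b for a, b in zip(bs, br))
    return same / max(len(br), 1) >= thr

class Server:
    """S: stores the encrypted reference templates ebr and receives eat."""
    def __init__(self, ebr: dict[str, bytes]):
        self._ebr = ebr                          # ID_i -> ebr_i

    def fetch(self, user_id: str) -> bytes:
        return self._ebr[user_id]                # ebr_i, still encrypted

    def store_log(self, eat: bytes) -> None:
        self._eat = eat                          # encrypted access log

class Terminal:
    """T: persistently holds k_br, k_at and thr; everything else is transient."""
    def __init__(self, server: Server, k_br: bytes, k_at: bytes, thr: float):
        self._server, self._thr = server, thr
        self._f_br, self._f_at = Fernet(k_br), Fernet(k_at)
        self._at: list[str] = []                 # at: kept only for short intervals

    def access_control(self, user_id: str, rd: bytes) -> bool:
        br = self._f_br.decrypt(self._server.fetch(user_id))  # decrypt ebr_i
        dec = match(derive_template(rd), br, self._thr)       # compare bs_i, br_i
        self._at.append(f"{user_id}:{dec}")      # update the access log at
        return dec                               # rd_i, bs_i, br_i now discarded

    def flush_log(self) -> None:
        """Encrypt at into eat and send it to S at regular intervals."""
        self._server.store_log(self._f_at.encrypt("\n".join(self._at).encode()))
        self._at.clear()
```

Enrolment would populate the server with Fernet(k_br).encrypt(br_i) for each user; the property the architecture aims for is that S only ever holds encrypted templates and encrypted logs.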

Risk Sources for Arch.1. Since the architecture does not include any secure component, we assume that no security operator is involved. The risk sources are therefore: the owner (A.1), cybercriminals (A.3), the state (A.4) and third parties (A.5). The owner (A.1) controls both the server S and the terminal T.

Personal Data for Arch.1 and Their Exploitability. At this stage, the privacy analyst presents each data element stored in each system component and its exploitability (see Table 7). As explained in Sect. 2, by “transient exploitation” of a component we mean exploitation for a short period of time or infrequent exploitation (e.g., once in several months), whereas “persistent exploitation” means the exploitation of a component for a long period of time (e.g., for several days or months). For example, \(dec_i\) provides the result of one access control for user i, whereas \(\overline{at}\) provides the access log of all users for all previous days. So, to learn the access log of all users over t days, a risk source must either access every \(dec_i\) for all users on each of the t days (persistent exploitation) or access \(\overline{at}\) once at the end of the t days (transient exploitation).
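
As a reading aid for Table 7, the distinction can be encoded as follows; the enum and field names are hypothetical, chosen only to mirror the table's columns.

```python
# Hypothetical encoding of Table 7 rows; names are illustrative only.
from dataclasses import dataclass
from enum import Enum

class Exploitability(Enum):
    TRANSIENT = "transient"      # short or infrequent access to the component
    PERSISTENT = "persistent"    # long-lived access (days or months)

@dataclass(frozen=True)
class DataElement:
    name: str                    # e.g. "dec_i" or "at"
    component: str               # "C.1" (terminal T) or "C.2" (server S)
    exploitability: Exploitability

# The access-log example from the text: one transient read of at yields what
# would otherwise require collecting dec_i persistently over t days.
rows = [
    DataElement("dec_i", "C.1", Exploitability.PERSISTENT),  # many reads needed
    DataElement("at", "C.1", Exploitability.TRANSIENT),      # one read suffices
]
```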

Table 7. Personal data in Arch.1 and their exploitability values

Refinement of Generic Harm Trees for Arch.1. In this phase, we consider the harm identity theft (H.2). Figure 9 shows the harm tree corresponding to this harm. Figure 12 in Appendix B shows how the generic harm tree (Fig. 2) for identity theft is pruned to obtain the architecture specific harm tree in Fig. 9. From Sect. A.1, we know that the risk sources for Arch.1 do not include A.2. Therefore, all branches of the generic harm tree for identity theft (H.2) that contain A.2 are pruned (pruned branches are marked by a red cross in Fig. 12). The definition of the architecture also makes it possible to instantiate the generic components \(C_i\), \(C_j\), \(C_k\), \(C_l\), \(C_m\) and \(C_n\).
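
The pruning itself is mechanical once a tree is encoded. Below is a minimal sketch, assuming a nested-node encoding with AND/OR labels and leaves that name (risk source, data element, component) triples; the paper presents the trees only graphically, so this representation is an assumption.

```python
# Sketch of harm-tree pruning; the node encoding is a hypothetical stand-in
# for the graphical trees of Figs. 2, 9 and 12.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Node:
    label: str                       # "AND", "OR", or a leaf like "A.2 exploits C.3"
    children: list["Node"] = field(default_factory=list)

def prune(node: Node, excluded: set[str]) -> Optional[Node]:
    """Drop every branch that mentions an excluded risk source or data element."""
    if not node.children:            # leaf
        return None if any(tok in node.label for tok in excluded) else node
    kept = [c for c in (prune(c, excluded) for c in node.children) if c is not None]
    if node.label == "AND" and len(kept) < len(node.children):
        return None                  # an AND node dies with any of its children
    return Node(node.label, kept) if kept else None

# Arch.1 has no security operator, so every branch naming A.2 is cut:
# arch1_tree = prune(generic_identity_theft_tree, {"A.2"})
```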

Fig. 9. Identity theft (H.2) harm tree for architecture Arch.1

A.2 Arch.3: Match-on-Card Technology

Description of Arch.3. Arch.2 is more protective than Arch.1 because it uses a secure component M to perform the comparison between the fresh template and the reference template. In addition, it involves a security operator (A.2) for a better separation of responsibilities. However, in Arch.2, the fresh template \(bs_i\) is still available in T along with \(ID_i\). Moreover, the clear template \(\overline{br_i}\) can still be accessed by the security operator (A.2), who controls M. In fact, A.2 has access to a lot of personal data. One way to overcome these difficulties is to use match-on-card technology. In Arch.3, pictured in Fig. 10, each user possesses a smart card C that stores his identity \(ID_i\) along with his enrolled template \(br_i\) (i.e., it stores \(\overline{br_i}\)), the threshold thr and the access control rules ac, and performs the matching operation without disclosing \(ID_i\) or \(br_i\) to the terminal T. The owner does not store any database of reference templates.

The user inserts the card into the terminal T and submits the fresh biometric raw data \(rd_i\). T derives a fresh template \(bs_i\) from \(rd_i\) and transfers it to C. C compares \(bs_i\) with \(br_i\) using the threshold thr and transfers the result of the access control \(dec_i\) to T. T informs the user about \(dec_i\) and sends it to the physical access control mechanism. The card C does not transfer any information apart from \(dec_i\) (not even the user identity \(ID_i\)) to T. C is assumed to be completely secure (e.g., it is tamper-resistant and personalized by a certified issuer during the enrolment phase). \(rd_i\), \(bs_i\) and \(dec_i\) are deleted from T and C as soon as their use is over. No access log \(\overline{at}\) is recorded.

The system components in this architecture are: the terminal T (C.1) and the smart card C (C.4).
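
The interaction can be sketched as follows, with the trust boundary made explicit: only \(dec_i\) ever crosses from the card C to the terminal T. Class names are illustrative assumptions, and derive_template and match are the same placeholders as in the Arch.1 sketch.

```python
# Hypothetical sketch of the Arch.3 (match-on-card) interaction.

def derive_template(rd: bytes) -> bytes:
    return rd                                    # placeholder feature extraction

def match(bs: bytes, br: bytes, thr: float) -> bool:
    same = sum(a == b for a, b in zip(bs, br))   # placeholder matcher
    return same / max(len(br), 1) >= thr

class SmartCard:
    """C: tamper-resistant; holds ID_i, br_i and thr and never discloses them."""
    def __init__(self, user_id: str, br: bytes, thr: float):
        self._id, self._br, self._thr = user_id, br, thr

    def match_on_card(self, bs: bytes) -> bool:
        return match(bs, self._br, self._thr)    # comparison stays inside C

class Terminal:
    """T: derives bs_i from rd_i and learns nothing beyond dec_i."""
    @staticmethod
    def access_control(card: SmartCard, rd: bytes) -> bool:
        dec = card.match_on_card(derive_template(rd))  # only dec_i crosses to T
        return dec                               # rd_i and bs_i discarded; no log
```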

Risk Sources for Arch.3. We assume that there is no security operator (A.2) in this architecture, since the security relies only on the smart cards possessed by the users. Therefore, the risk sources to be considered include: the owner (A.1), cybercriminals (A.3), the state (A.4) and third parties (A.5). The owner (A.1) controls the terminal T.

Fig. 10. Architecture Arch.3: Match-On-Card technology (Color figure online)

Table 8. Personal data in Arch.3 and their exploitability values

Personal Data and Their Exploitability for Arch.3. Table 8 presents each data item stored in each system component and the corresponding exploitability values for Arch.3. A risk source must have enough technical resources to exploit T persistently to get access to \(dec_i\), \(rd_i\) or \(bs_i\). However, in contrast with Arch.1 and Arch.2, \(ID_i\) is not stored in any component accessible to the risk sources in Arch.3 (it resides only on the user's card C). Thus, in order to exploit \(dec_i\), \(rd_i\) or \(bs_i\), risk sources must have \(ID_i\) as background information. Since C is considered to be secure and belongs to the user, it does not appear in Table 8.

Refinement of Generic Harm Trees for Arch.3. Figure 15 in Appendix B shows how the generic harm tree for identity theft (H.2) (presented in Fig. 2) can be pruned to derive the corresponding harm tree for Arch.3 (presented in Fig. 11). In Arch.3, \(ID_i\), \(\overline{br_i}\), \(\overline{ebr_i}\) and \(k_{br}\) are never present in any component that the risk sources may access (i.e., the terminal T), so all branches of the generic tree corresponding to these data elements are pruned. Similarly, since the risk source A.2 is not part of Arch.3, all branches concerning A.2 are pruned as well. The definition of the architecture also makes it possible to instantiate the generic components \(C_i\), \(C_j\), \(C_k\), \(C_l\), \(C_m\) and \(C_n\).

Fig. 11. Identity theft (H.2) harm tree for architecture Arch.3

B Pruning of Harm Trees and Likelihood Computation for Identity Theft (H.2)

In this appendix, we present the harm trees for identity theft, showing in detail how branches of the generic tree are pruned based on different conditions (related to the architecture and the context) discussed in the paper.
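
To complement the pruned trees, the sketch below shows how likelihoods could be propagated through such a tree, using the worst-case combination rules suggested by Notes 6 and 7: the minimum for AND nodes whose events may imply one another, and the capped sum for OR nodes whose events may be mutually exclusive. The node encoding, the mapping of the two notes to AND and OR nodes, and the leaf values are all assumptions made for illustration.

```python
# Hypothetical likelihood propagation over a harm tree, using conservative
# combination rules (min for dependent AND nodes, capped sum for possibly
# exclusive OR nodes), as suggested by Notes 6 and 7.
from dataclasses import dataclass, field

@dataclass
class HarmNode:
    kind: str                          # "AND", "OR" or "LEAF"
    p: float = 0.0                     # leaf likelihood, elicited per context
    children: list["HarmNode"] = field(default_factory=list)

def likelihood(n: HarmNode) -> float:
    if n.kind == "LEAF":
        return n.p
    ps = [likelihood(c) for c in n.children]
    return min(ps) if n.kind == "AND" else min(1.0, sum(ps))

# Toy example: the harm occurs via one of two paths, the first requiring
# two exploitations to succeed together.
tree = HarmNode("OR", children=[
    HarmNode("AND", children=[HarmNode("LEAF", 0.3), HarmNode("LEAF", 0.5)]),
    HarmNode("LEAF", 0.1),
])
print(likelihood(tree))                # min(0.3, 0.5) + 0.1 = 0.4
```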

Fig. 12. Pruning of the generic harm tree for identity theft (H.2) to derive the harm tree for Arch.1 (Phase 2)

Fig. 13. Identity theft (H.2) final harm tree for architecture Arch.1

Fig. 14. Pruning of the generic harm tree for identity theft (H.2) to derive the harm tree for Arch.2 (Phase 2) (Color figure online)

Fig. 15. Pruning of the generic harm tree for identity theft (H.2) to derive the harm tree specific to Arch.3 (Phase 2)

Fig. 16. Final pruning of the harm tree for identity theft (H.2) for architecture Arch.3 (Phase 3)


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

De, S.J., Le Métayer, D. (2017). A Refinement Approach for the Reuse of Privacy Risk Analysis Results. In: Schweighofer, E., Leitold, H., Mitrakas, A., Rannenberg, K. (eds.) Privacy Technologies and Policy. APF 2017. Lecture Notes in Computer Science, vol. 10518. Springer, Cham. https://doi.org/10.1007/978-3-319-67280-9_4

  • DOI: https://doi.org/10.1007/978-3-319-67280-9_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-67279-3

  • Online ISBN: 978-3-319-67280-9
