Better Data Protection by Design Through Multicriteria Decision Making: On False Tradeoffs Between Privacy and Utility

  • Conference paper
  • In: Privacy Technologies and Policy (APF 2017)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 10518)

Included in the following conference series: Annual Privacy Forum (APF)

Abstract

Data Protection by Design (DPbD, also known as Privacy by Design) has received much attention in recent years as a method for building data protection into IT systems from the start. In the EU, DPbD will become mandatory from 2018 onwards under the GDPR. In earlier work, we emphasized the multidisciplinary nature of DPbD. The present paper builds on this to argue that DPbD also needs a multicriteria approach that goes beyond the traditional focus on (data) privacy (even if understood in its multiple meanings).

The paper is based on the results of a survey (n = 101) among employees of a large institution concerning the introduction of technology that tracks some of their behaviour. Even though a substantial portion of respondents are security/privacy researchers, concerns revolved strongly around social consequences of the technology change, usability issues, and transparency. The results taken together indicate that the decrease in privacy through data collection was associated with (a) an increase in accountability, (b) the blocking of non-authorized uses of resources, (c) a decrease in usability, (d) an altered perception of a communal space, (e) altered actions in the communal space, and (f) an increased salience of how decisions are made and communicated. These results call into question the models from computer science/data mining that posit a privacy-utility tradeoff. Instead, this paper argues, multicriteria notions of utility are needed, and this leads to design spaces in which less privacy may be associated with less utility rather than be compensated for by more utility, as the standard tradeoff models suggest. The paper concludes with an outlook on activities aimed at raising awareness and bringing the wider notion of DPbD into decision processes.
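
To make the contrast concrete, the two utility models at issue can be sketched as follows (a minimal illustration in notation of my own, not taken from the paper; the single-criterion form follows risk–utility formulations such as the R–U confidentiality map of Duncan et al. [11], and the choice of criteria simply mirrors findings (a)–(f) above). The standard tradeoff model scores a data release \(X'\) by a single scalar utility \(U\) under a disclosure-risk budget \(\rho\):

\[ \max_{X'} \; U(X') \quad \text{s.t.} \quad R(X') \le \rho \]

Here, less privacy (a higher \(R\)) can in principle always be compensated for by more utility. A multicriteria notion of utility instead aggregates several criteria,

\[ U(X') = \sum_{j=1}^{k} w_j \, u_j(X'), \qquad w_j > 0, \]

where the \(u_j\) might include accountability, the blocking of non-authorised uses, usability, and the perception of the communal space. If some \(u_j\) decrease together with privacy, as the survey suggests for usability and the communal space, then designs with less privacy can also have less overall utility, and the risk–utility frontier need not be monotone.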


Notes

  1. For example, halal/kosher food preference can be an indicator of religion.

  2. The button was a fictitious PET, initially thought of as a version of distributed and possibly collaborative anonymisation [20], but open to interpretation by respondents.

  3. It is possible, but irrelevant for the results, that this person had first flipped through the online version, was blocked from re-taking it, and therefore filled in the paper version.

  4. In reaction to this, decision makers said that observed cases had in fact been communicated, and asked whether it was their task to prove abuse – which indeed would be impossible without another form of surveillance technology.

  5. In the case study, the latter two were mentioned by administrative/management personnel involved in the card-reader deployment, in a follow-up interview to the survey.

  6. This constraint on utility also illustrates the dependence of technical solutions’ utility on purposes. The proposal “no plastic cups and no cups in the cafeteria” in response to Q2 would serve the purpose of barring non-authorised use, but not that of accountability₂. However, the existence of this purpose was likely unknown to respondents.

  7. The semantics of risk generally used in risk–utility models focus on an individual-centric notion of privacy. The current focus on the risks of tracking using personal data (see Sect. 3) follows this approach. Certainly privacy is not only an individual but also a collective value, so some aspects of “altered perceptions of a communal space” could be modelled as an additional factor of privacy risk. However, it appears questionable to also subsume usability or the salience of decision making under “privacy risks” (see the sketch after these notes).

  8. Complemented by privacy protection as an opacity tool towards the powerless (data subjects).
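
A minimal formalisation of the point in Note 7 (my notation, not the paper’s): collective aspects of privacy can be folded into the risk side of a risk–utility model, e.g.

\[ R(X') = R_{\mathrm{ind}}(X') + R_{\mathrm{coll}}(X'), \]

where \(R_{\mathrm{ind}}\) is the individual disclosure risk and \(R_{\mathrm{coll}}\) captures effects on the communal space. Usability and the salience of decision making, by contrast, are more naturally kept as separate utility criteria \(u_j\) (see the sketch after the Abstract) than stretched into components of \(R\).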

References

  1. Article 29 Working Party: Opinion 8/2001 on the Processing of Personal Data in the Employment Context (2001). http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2001/wp48_en.pdf

  2. Bertino, E., Lin, D., Jiang, W.: A survey of quantification of privacy preserving data mining algorithms. In: Aggarwal, C.C., Yu, P.S. (eds.) Privacy-preserving Data Mining: Models and Algorithms, pp. 181–200. Springer, New York (2008)

  3. Boyatzis, R.: Transforming Qualitative Information: Thematic Analysis and Code Development. Sage, London (1998)

  4. Bovens, M.: Analysing and Assessing Public Accountability. A Conceptual Framework. European Governance Papers (EUROGOV) No. C-06-01 (2006). http://www.connex-network.org/eurogov/pdf/egp-connex-C-06-01.pdf

  5. Crespo García, A., et al.: PRIPARE Privacy- and Security-by-Design Methodology Handbook (2016). http://pripareproject.eu/wp-content/uploads/2013/11/PRIPARE-Methodology-Handbook-Final-Feb-24-2016.pdf. Accessed 13 Apr 2017

  6. Danezis, G., Domingo-Ferrer, J., Hansen, M., Hoepman, J.-H., Le Métayer, D., Tirtea, R., Schiffner, S.: Privacy and Data Protection by Design from Policy to Engineering. ENISA Report (2014). https://www.enisa.europa.eu/publications/privacy-and-data-protection-by-design. Accessed 13 Apr 2017

  7. Data Protection Commissioner: Guidance Note for Data Controllers on Location Data (undated). https://www.dataprotection.ie/docs/Guidance-Note-for-Data-Controllers-on-Location-Data/1587.htm. Accessed 13 Apr 2017

  8. De Hert, P., Gutwirth, S.: Privacy, data protection and law enforcement. Opacity of the individual and transparency and power. In: Claes, E., Duff, A., Gutwirth, S. (eds.) Privacy and the Criminal Law, pp. 61–104. Intersentia, Antwerp (2006)

  9. De Wolf, R., Vanderhoven, E., Berendt, B., Pierson, J., Schellens, T.: Self-reflection on privacy research in social networking sites. Behav. Inform. Technol. (2016). doi:10.1080/0144929X.2016.1242653

  10. Domingo-Ferrer, J., Martínez, S., Sánchez, D., Soria-Comas, J.: Co-Utility: self-enforcing protocols for the mutual benefit of participants. Eng. Appl. AI 59, 148–158 (2017)

  11. Duncan, G.T., Keller-McNulty, S.A., Stokes, S.L.: Disclosure Risk vs. Data Utility: The R-U Confidentiality Map. National Institute of Statistical Sciences. Technical report Number 121 (2001). http://www.niss.org/sites/default/files/technicalreports/tr121.pdf. Accessed 13 Apr 2017

  12. Elliot, M., Mackey, E., O’Hara, K., Tudor, C.: The Anonymisation Decision-Making Framework. UKAN, Manchester (2016). http://ukanon.net/wp-content/uploads/2015/05/The-Anonymisation-Decision-making-Framework.pdf. Accessed 13 Apr 2017

  13. European Union Agency for Fundamental Rights (FRA): Twelve operational fundamental rights considerations for law enforcement when processing Passenger Name Record (PNR) data (2014). https://fra.europa.eu/sites/default/files/fra-2014-fundamental-rights-considerations-pnr-data-en.pdf. Accessed 13 Apr 2017

  14. Gürses, S., Diaz, C.: Two tales of privacy in online social networks. IEEE Secur. Priv. 11(3), 29–37 (2013)

  15. Hendrickx, F.: Protection of Workers’ Personal Data in the European Union: Two Studies. http://ec.europa.eu/social/BlobServlet/docId=2507. Accessed 13 Apr 2017

  16. Herelixka, E.: Experiencing a Privacy Enhancing Technology. An Exploratory User Study of Collaborative Anonymization. Master’s thesis, KU Leuven, Faculty of Science (2016)

  17. Jameson, A., Berendt, B., Gabrielli, S., Cena, F., Gena, C., Vernero, F., Reinecke, K.: Choice architecture for Human-Computer Interaction. Found. Trends Hum. Comput. Inter. 7(1–2), 1–235 (2014)

  18. Renaud, K., Volkamer, M., Renkema-Padmos, A.: Why doesn’t jane protect her privacy? In: De Cristofaro, E., Murdoch, S.J. (eds.) PETS 2014. LNCS, vol. 8555, pp. 244–262. Springer, Cham (2014). doi:10.1007/978-3-319-08506-7_13

  19. Schaar, P.: Privacy by design. Identity Inform. Soc. 3(2), 267–274 (2010)

  20. Soria-Comas, J., Domingo-Ferrer, J.: Co-utile collaborative anonymization of microdata. In: Torra, V., Narukawa, Y. (eds.) MDAI 2015. LNCS, vol. 9321, pp. 192–206. Springer, Cham (2015). doi:10.1007/978-3-319-23240-9_16

  21. Tsormpatzoudi, P., Berendt, B., Coudert, F.: Privacy by design: from research and policy to practice – the challenge of multi-disciplinarity. In: Berendt, B., Engel, T., Ikonomou, D., Le Métayer, D., Schiffner, S. (eds.) APF 2015. LNCS, vol. 9484, pp. 199–212. Springer, Cham (2016). doi:10.1007/978-3-319-31456-3_12


Acknowledgements

I thank all respondents of the survey for their thought-inspiring answers, and all those involved in the 2016 “New Developments in Data Privacy” workshops for support and valuable ideas: the Cambridge University Isaac Newton Institute and Turing Gateway to Mathematics, the organisers Mark Elliot, Natalie Shlomo and Chris Skinner, and all participants. Ralf De Wolf provided helpful comments on an earlier version of the text.

Author information

Correspondence to Bettina Berendt.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Berendt, B. (2017). Better Data Protection by Design Through Multicriteria Decision Making: On False Tradeoffs Between Privacy and Utility. In: Schweighofer, E., Leitold, H., Mitrakas, A., Rannenberg, K. (eds) Privacy Technologies and Policy. APF 2017. Lecture Notes in Computer Science, vol 10518. Springer, Cham. https://doi.org/10.1007/978-3-319-67280-9_12

  • DOI: https://doi.org/10.1007/978-3-319-67280-9_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-67279-3

  • Online ISBN: 978-3-319-67280-9

  • eBook Packages: Computer Science (R0)
