Abstract
In this chapter, we focus on how policy design must evolve and extend beyond the technical focus applied thus far. In order for privacy protection to be internalized into the design of smart devices, and ultimately into the mindsets of developers, policy solutions that strengthen the implementation of the concept of privacy and data protection by design and default for an Internet of Things environment are necessary. This chapter closes the loop, merging the findings of the previous chapters on the legal principles, technical tools, and their interplay, in order to establish guidelines that support the development of privacy-friendly designs.
Notes
- 1.
Cf. Yi/Paulet/Bertino, pp. 49 et seqq.
- 2.
Cf. Goldberg, p. 10; cf. also Feigenbaum/Ford, pp. 58 et seqq. on TOR.
- 3.
Berkman Center Report, Don’t Panic, 2016, p. 10.
- 4.
Berkman Center Report, Don’t Panic, 2016, p. 11.
- 5.
Berkman Center Report, Don’t Panic, 2016, p. 11.
- 6.
Cf. Koops/Leenes, pp. 159 et seqq.; Bowman et al., p. 61 making the same argument and stating that the main weakness of access control tools is scalability.
- 7.
Cf. Palfrey/Gasser, Interop, pp. 75-88.
- 8.
Cf. e.g., Lee, Cisco Blog, Backwards Compatibility, 2014; cf. also Palfrey/Gasser, Interop, pp. 76-77.
- 9.
Berkman Center Report, Don’t Panic, 2016, p. 11; cf. also Catuogno/Turchi, p. 206.
- 10.
Berkman Center Report, Don’t Panic, 2016, p. 3.
- 11.
For example, Android operating systems are rarely updated on older phones even though newer versions are available. Cf. Berkman Center Report, Don’t Panic, 2016, p. 11.
- 12.
Berkman Center Report, Don’t Panic, 2016, pp. 4-5; cf. also Whitten/Tygar, pp. 683 et seqq. Note that other technical tools suffer the same weakness, cf. e.g., Bowman et al., p. 62 on access control tools.
- 13.
Phil Zimmermann developed it in 1991. Cf. on the motivation behind the development of PGP, Zimmermann, PGP, pp. 37-41; cf. also Mollin, pp. 227-228. Released as open-source software, PGP was first shared among a community of security specialists. Yet, because PGP was and is based on patented algorithms such as RSA and ElGamal (see Chap. 6), the use of the free software could entail patent litigation. Consequently, the wide use of PGP was deterred. Furthermore, in the early 90s the US strictly prohibited the export of cryptographic systems using keys larger than 40 bits. Cf. Mollin, pp. 227-228; Rescorla, pp. 11-12; cf. also Creutzig/Buhl, pp. 55-56; Wu/Irwin, p. 996.
- 14.
First, PGP creates a session key, which is a one-time-only secret key. While the plaintext is encrypted with the session key, the session key itself is encrypted with the recipient’s (Bob’s) public key. Both the ciphertext, encrypted by the session key, and the cipher of the session key, encrypted by the public key, are sent to Bob. The decryption process works in reverse: Bob uses his private key to decrypt the session key and the session key to decrypt the ciphertext. Thereby, PGP combines the advantages of symmetric cryptography (computational efficiency) with those of public-key cryptography (no need to exchange a secret key in advance). Cf. on the functioning of PGP: Creutzig/Buhl, pp. 10, 28; Lindhorst, pp. 23-24; Mollin, pp. 227-232; Zimmermann, pp. 41-51; Wu/Irwin, pp. 996-997.
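The hybrid scheme described in this note can be sketched in a few lines. The following is a toy illustration only: tiny textbook-RSA parameters and a hash-based stream cipher stand in for PGP’s real, vetted algorithms, and nothing here is secure or part of PGP itself.

```python
# Toy sketch of PGP-style hybrid encryption (illustrative only;
# textbook RSA with tiny primes and a hash-derived keystream are
# NOT secure -- real PGP uses vetted algorithms and key sizes).
import hashlib
import secrets

# --- Minimal "textbook" RSA key pair (Bob's) with small primes ---
p, q = 61, 53
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the session key."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

# 1. The sender creates a one-time session key.
session_key = secrets.randbelow(n - 2) + 2   # small enough for the toy RSA

# 2. The plaintext is encrypted under the session key (symmetric part).
plaintext = b"meet at noon"
sk_bytes = session_key.to_bytes(2, "big")
ciphertext = xor(plaintext, keystream(sk_bytes, len(plaintext)))

# 3. The session key itself is wrapped with Bob's public key.
wrapped_key = pow(session_key, e, n)

# --- Bob decrypts in reverse order ---
recovered_key = pow(wrapped_key, d, n)       # unwrap with private key
rk_bytes = recovered_key.to_bytes(2, "big")
recovered = xor(ciphertext, keystream(rk_bytes, len(ciphertext)))
assert recovered == plaintext
```

The symmetric cipher does the bulk of the work efficiently, while the public-key step removes the need to share a secret in advance, which is exactly the division of labor the note describes.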
- 15.
Cf. Open Whisper Systems website <https://whispersystems.org/> (last visited November 2017).
- 16.
Brown/Marsden, pp. 53-54. Note that apart from Internet Explorer, P3P had only limited support in other browsers. The W3C suspended the development of a second version of the standard in 2007. Cf. Brown/Marsden, p. 54.
- 17.
Brown/Marsden, pp. 53-54; Iachello/Hong, p. 50 with further references; Wang/Kobsa, pp. 352 et seqq.
- 18.
Iachello/Hong, p. 50; cf. e.g., Langheinrich, pp. 123-127 or Ackerman, pp. 430-439.
- 19.
Berkman Center Report, Don’t Panic, 2016, p. 4.
- 20.
E.g., the messaging service WhatsApp changed to end-to-end encryption (see Greenberg, Wired, 2014). It can be assumed that most users did not alter this new default setting. Cf. also Ausloos et al., pp. 14 et seqq. on the tendency of users not to change the default settings of SNS.
- 21.
PCAST Report, 2014, p. xi; cf. also Wood/O’Brien/Gasser, p. 4 with further references; IERC, IoT Report, 2015, pp. 18-19; WP 29, Opinion on Anonymisation Techniques, pp. 12-16 in which the WP 29 argues that in general anonymization tools must be combined with other tools in order to be effective.
- 22.
Cf. in particular Ohm, pp. 1701 et seqq.; cf. also de Montjoye et al., pp. 536-539, on the reidentifiability of credit card metadata; Kosinski/Stillwell/Graepel, unpaginated; Sweeney/Abu/Winn, pp. 1 et seqq. on linking demographic information with publicly available user profiles.
- 23.
Often referred to are the AOL data release called “AOL Research” and the “Netflix Prize” data study, which both occurred in 2006; cf. Ohm, pp. 1717-1722; Narayanan/Shmatikov, pp. 111 et seqq.; Rubinstein/Hartzog, pp. 704 et seqq.; Schwartz/Solove, PII, pp. 1841-1843. Research by Latanya Sweeney is also usually referred to as an example concerning publicly available data (such as ZIP code, sex, and birth date) and the risk of identification, cf. e.g., Sweeney, Demographics, pp. 1 et seqq.
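The linkage risk documented by Sweeney can be sketched with a minimal example: a “de-identified” dataset is joined with a public record on the quasi-identifiers ZIP code, sex, and date of birth. All names and values below are invented for illustration; they do not come from any real dataset.

```python
# Illustrative linkage attack on hypothetical data: re-identifying
# records in an "anonymized" dataset by joining it with a public
# voter roll on quasi-identifiers (ZIP, sex, date of birth).

medical = [  # names removed, quasi-identifiers retained
    {"zip": "02138", "sex": "F", "dob": "1945-07-21", "diagnosis": "asthma"},
    {"zip": "02139", "sex": "M", "dob": "1962-03-02", "diagnosis": "flu"},
]
voters = [   # public record carrying names and the same quasi-identifiers
    {"name": "J. Doe", "zip": "02138", "sex": "F", "dob": "1945-07-21"},
    {"name": "A. Roe", "zip": "02144", "sex": "M", "dob": "1970-01-01"},
]

def quasi_id(rec):
    """The (ZIP, sex, dob) triple acts as a near-unique fingerprint."""
    return (rec["zip"], rec["sex"], rec["dob"])

index = {quasi_id(v): v["name"] for v in voters}
reidentified = [
    {"name": index[quasi_id(m)], "diagnosis": m["diagnosis"]}
    for m in medical if quasi_id(m) in index
]
print(reidentified)  # [{'name': 'J. Doe', 'diagnosis': 'asthma'}]
```

Removing direct identifiers was not enough: one exact match on the quasi-identifier triple re-attaches a sensitive attribute to a named person, which is the core of the “broken promises of anonymization” argument.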
- 24.
Ohm, p. 1730; cf. also Spiekermann/Novotny, p. 193.
- 25.
In fact, for every benefit of anonymous communication (e.g., whistle-blowing, political engagement), abuses (e.g., assisting criminal activities) may arise.
- 26.
For instance, in order to counter the concern of granting anonymous authorizations, deanonymizing third parties can be implemented. Those deanonymizing parties can—under predefined conditions—establish the link between an action and the user. Cf. Birrell/Schneider, pp. 43-44. By means of cryptographic techniques, identity management systems further ensure that no corrupted deanonymizing party can establish the link between an action and an identity. Note that identity providers themselves can also act as deanonymizing parties (e.g., Shibboleth). von Ahn and colleagues, among others, argue for the adoption of anonymity protocols that allow for selective traceability when specific criteria are met, by means of a public-key encryption system and group signatures. Group signatures allow members of a group to sign messages anonymously as members of that group. While it can be verified that the group signed the message, it cannot be determined which member signed it, unless the group manager decides to reveal this. Cf. von Ahn et al., pp. 208 et seqq.
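The role of a deanonymizing third party can be sketched as a simple escrow: users act under pseudonyms, and only the escrow holds the pseudonym-to-identity mapping, which it reveals solely when a predefined condition is met. The class and names below are hypothetical illustrations, not any deployed protocol, and the cryptographic safeguards the note mentions are omitted.

```python
# Minimal sketch of a deanonymizing third party (escrow model).
# Illustrative only: real systems protect the mapping cryptographically
# and split trust so no single corrupted party can deanonymize alone.
import secrets

class DeanonymizingParty:
    def __init__(self):
        self._escrow = {}  # pseudonym -> real identity, held in confidence

    def register(self, identity: str) -> str:
        """Issue a fresh pseudonym under which the user acts."""
        pseudonym = secrets.token_hex(8)
        self._escrow[pseudonym] = identity
        return pseudonym

    def deanonymize(self, pseudonym: str, condition_met: bool) -> str:
        # The link between action and user is established only under
        # predefined conditions (e.g., a court order in practice).
        if not condition_met:
            raise PermissionError("predefined condition not met")
        return self._escrow[pseudonym]

escrow = DeanonymizingParty()
alias = escrow.register("alice@example.com")
# Actions are attributed to `alias` only; the identity stays hidden
# until the predefined condition is satisfied:
assert escrow.deanonymize(alias, condition_met=True) == "alice@example.com"
```

The design choice the note describes is exactly this separation: everyday verifiers see only the pseudonym (or group signature), while the traceability decision sits with a distinct, constrained party.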
- 27.
E.g., various cloud services to back up personal data exist.
- 28.
Bits of Freedom Report, Transparent Consumer, 2016.
- 29.
Benkler, p. 19; cf. HTML5 W3C website <https://www.w3.org/TR/html5/> (last visited November 2017).
- 30.
Dommering, p. 13.
- 31.
Cf. i.a. Cavoukian/Shapiro/Cronk, pp. 1 et seqq.; Finneran Dennedy/Fox/Finneran, pp. 29 et seqq. in particular; Kalloniatis/Kavakli/Gritzalis, p. 186 in particular; Spiekermann/Cranor, pp. 67 et seqq.; ENISA Report, 2014, pp. 3 et seqq.; Opinion EDPS, 2015, pp. 10 et seqq.
- 32.
Finneran Dennedy/Fox/Finneran, p. 29.
- 33.
Finneran Dennedy/Fox/Finneran, p. 29; cf. also Cavoukian/Shapiro/Cronk, p. 3 defining privacy engineering as “the discipline of understanding how to include privacy as a non-functional requirement in system engineering.”; Gürses/del Alamo, p. 40 defining privacy engineering as “research framework that focuses on designing, implementing, adapting, and evaluating theories, methods, techniques, and tools to systematically capture and address privacy issues in the development of socio-technical systems.”; cf. also Spiekermann/Cranor, pp. 67 et seqq. differentiating between “privacy by policy” and “privacy by architecture” approaches.
- 34.
Cf. Anderson, pp. 3 et seqq.; Kalloniatis/Kavakli/Gritzalis, p. 186; Cavoukian/Shapiro/Cronk, p. 5; Stoll, pp. 219 et seqq.
- 35.
Kalloniatis/Kavakli/Gritzalis provide a literature review of engineering guidelines.
- 36.
Kalloniatis/Kavakli/Gritzalis, p. 186.
- 37.
Own classification based on literature found on the topic of privacy engineering. Cf. i.a. Cavoukian/Shapiro/Cronk, pp. 1 et seqq.; Finneran Dennedy/Fox/Finneran’s Privacy Engineer’s Manifesto; Fhom/Bayarou, pp. 235 et seqq.; Kalloniatis/Kavakli/Gritzalis, pp. 186 et seqq. who describe multiple methods/frameworks to design privacy aware information systems.
- 38.
Hoepman, p. 449.
- 39.
Fhom/Bayarou, p. 236.
- 40.
Koops/Leenes, pp. 167-168; cf. also Gürses/Troncoso/Diaz, unpaginated.
- 41.
ENISA Report, 2014, pp. 18-22; cf. also Hoepman pp. 452 et seqq. who relies on these eight privacy design strategies.
- 42.
Spiekermann/Cranor, pp. 67 et seqq.
- 43.
Various authors have argued for breaking down privacy and data protection principles into more concrete goals or targets. Cf. e.g., Finneran Dennedy/Fox/Finneran, in particular pp. 99 et seqq.; Oetzel/Spiekermann, p. 133, who suggest formulating “privacy targets as action items” (as seen in modelling techniques like UML (Unified Modelling Language) and ARIS (Architecture of Integrated Information Systems)), as this would make it more likely that developers/engineers encode privacy.
- 44.
Cf. Koops/Leenes, pp. 167-168; cf. also Gürses/Troncoso/Diaz, unpaginated; ENISA Report, 2014, pp. 19 et seqq.
- 45.
Cavoukian/Shapiro/Cronk, p. 11.
- 46.
WP 29, Opinion on IoT, 2014, pp. 21-22.
- 47.
WP 29, Opinion on IoT, 2014, pp. 22-23.
- 48.
WP 29, Opinion on IoT, 2014, pp. 22-23.
- 49.
WP 29, Opinion on IoT, 2014, p. 23.
- 50.
Threats are linked to the materialization of privacy concerns listed in Sect. 1.2; cf. also Cavoukian/Shapiro/Cronk, p. 10 stating that risk is “a function of harm and probability” and that quantifying the harm is dependent on the context and culture.
- 51.
Cf. Brost/Hoffmann, p. 138.
- 52.
Brost/Hoffmann, p. 140.
- 53.
Iachello/Hong, p. 85 with further references.
- 54.
Hong et al., pp. 91-97 in particular.
- 55.
Rubinstein/Hartzog, pp. 741-743.
- 56.
Rubinstein/Hartzog, pp. 741-743.
- 57.
Such as the classifications described in Chap. 6. Other classifications are possible, e.g., Fairchild/Ribbers, pp. 120 et seq. propose four risk classes: 0 = public level risk; I = basic level risk, II = increased risk, III = high risk; cf. also ISO 27002: 2013, 8.2.1 stating that “creating groups of information with similar protection needs and specifying information security procedures that apply to all the information in each group facilitates this.”
- 58.
Brost/Hoffmann, p. 140.
- 59.
Spiekermann and Novotny propose using best available techniques (BAT), a concept that originally stems from environmental law, for anonymization technologies. Their suggestion can be applied to all four technical tools, namely, security, anonymity, autonomy, and transparency tools. Spiekermann/Novotny, p. 194. Cf. also Recital 13 Directive 2010/75/EU.
- 60.
Cf. Spiekermann/Novotny, p. 194; Recital 13 Directive 2010/75/EU.
- 61.
STOA assesses technology and provides the European Parliament with studies on the impact of certain technologies, such as the STOA study on the Ethical Aspects of Cyber-Physical Systems, 2016.
- 62.
Spiekermann/Novotny, p. 194.
- 63.
Landau, p. 68; cf. also Klitou, p. 285.
- 64.
Landau, p. 68; cf. also Cranor/Sadeh, pp. 8-9.
- 65.
Cf. e.g., the MIT course “Engineering Ethics” or the Harvard University course “Ethics for Engineers” taught in 2017.
- 66.
Cf. Landau, p. 66; cf. also Cranor/Sadeh, pp. 8-9.
- 67.
Büchi/Just/Latzer, unpaginated; Junco, pp. 45-47; Hoofnagle et al., p. 20; Palfrey/Gasser, Born Digital, pp. 73 et seqq. in particular; Thierer, Innovation, pp. 69 et seqq.
- 68.
Cf. Palfrey/Gasser, Born Digital, pp. 53 et seqq.
- 69.
Thierer, Innovation, p. 69.
- 70.
Palfrey/Gasser, Born Digital, pp. 53-54; cf. in particular studies conducted by PEW with the Youth & Media Lab at the Berkman Klein Center for Internet & Society: PEW Teens and Privacy, 2013 and PEW Privacy Advice, 2013.
- 71.
Palfrey/Gasser, Born Digital, pp. 66-69.
- 72.
Egelman et al., pp. 591 et seqq. Note that different education tools have been developed in this field (e.g., the Oyoty chat bot, which is designed to teach kids how to be safe when sharing content online; see the Oyoty website <https://www.oyoty.com/> (last visited November 2017)).
- 73.
Egelman et al., pp. 592-593, stating that: “Taken as a whole, the principles demonstrate the general types of threats to privacy, how they occur, why organizations may exploit them, what the possible consequences are, and what people can do about it.” As the TPP curriculum and TROPE materials are released under a CC license, they are available to the public.
- 74.
Egelman et al., pp. 592-593.
- 75.
Cranor/Sadeh, pp. 7-9; cf. also Bamberger/Mulligan, pp. 76 et seqq. on corporate privacy management in the US; Determann, pp. 6 et seqq.; Finneran Dennedy/Fox/Finneran, p. 261. Other terms such as privacy manager and privacy analyst are also commonly employed.
- 76.
Cf. Art. 37-39 GDPR.
- 77.
Cf. Bamberger/Mulligan, pp. 243 et seqq.
- 78.
Art. 39 GDPR.
- 79.
Cf. Bamberger/Mulligan, pp. 243 et seqq.
- 80.
Finneran Dennedy/Fox/Finneran, p. 261; cf. also Bowman et al., pp. 145 et seqq.; Cavoukian/Shapiro/Cronk, pp. 1 et seqq.
- 81.
Cavoukian/Shapiro/Cronk, p. 4.
- 82.
Cf. Bamberger/Mulligan, pp. 59 et seqq. who interviewed corporate privacy officers in the US, Germany, France, Spain, and the UK. Cf. Determann, pp. 6 et seqq. on the German “Datenschutzbeauftragter”.
- 83.
Cf. Bamberger/Mulligan, pp. 76 et seqq. (on the US CPO) or pp. 94 et seqq. (on the German privacy officer, referred to as DPO). Both show that privacy officers have a “high-level, strategic, and forward-looking” role within a company (citation, p. 94).
- 84.
Finneran Dennedy/Fox/Finneran, p. 260.
- 85.
Cf. Bamberger/Mulligan, pp. 78 et seqq. (on the US CPO) or pp. 95 et seqq. (on the German DPO).
- 86.
Cf. tasks in Art. 39 GDPR; CNIL, DPO Interpretation, pp. 25-26.
References
Literature
Ackerman, M. (2004). Privacy in pervasive environments: Next generation labeling protocols. Personal and Ubiquitous Computing, 8(6), 430-439.
Anderson, R. (2008). Security Engineering—A Guide to Building Dependable Distributed Systems (2nd edition). Indianapolis: Wiley Publishing.
Ausloos, J., Kindt, E., Lievens, E., Valcke, P. & Dumortier, J. (2013). Guidelines for Privacy-Friendly Default Settings. KU Leuven Interdisciplinary Center for Law and ICT Working Paper Series 12/2013.
Bamberger, K. & Mulligan, D. (2015). Privacy on the Ground—Driving Corporate Behavior in the United States and Europe. Cambridge: MIT Press.
Benkler, Y. (2016). Degrees of Freedom, Dimensions of Power. Daedalus, Journal of the American Academy of Arts & Science, 145(1), 18-32.
Birrell, E. & Schneider, F. (2013). Federated Identity Management Systems: A Privacy-Based Characterization. IEEE Security & Privacy, 11(5), 36-48.
Bowman, C., Gesher, A., Grant, J. & Slate, D. (2015). The Architecture of Privacy—On Engineering Technologies that Can Deliver Trustworthy Safeguards. Sebastopol: O’Reilly.
Brost, G.S. & Hoffmann, M. (2015). Identifying Security Requirements and Privacy Concerns in Digital Health Applications. In S. Fricker, C. Thümmler & A. Gavras (Eds.), Requirements Engineering for Digital Health (pp. 133-154). Heidelberg: Springer.
Brown, I. & Marsden, C. (2013). Regulating Code—Good Governance and Better Regulation in the Information Age. Cambridge: MIT Press.
Büchi, M., Just, N. & Latzer, M. (2016). Caring is not enough: the importance of Internet skills for online privacy protection [Electronic version]. Information, Communication & Society, unpaginated. <https://doi.org/10.1080/1369118X.2016.1229001>
Cavoukian, A., Shapiro, S. & Cronk, J. (2014). Privacy Engineering: Proactively Embedding Privacy by Design. Retrieved from: <https://iapp.org/resources/article/privacy-engineering-proactively-embedding-privacy-by-design/>
Cranor, L.F. & Sadeh, N. (2013). Privacy engineering emerges as a hot new career. IEEE Potentials, Security & Privacy Lockdown, 7-9.
Creutzig, C. & Buhl, A. (1999). PGP—Pretty Good Privacy: Der Briefumschlag für Ihre Post, Translation into German (4th edition). Art d’Ameublement.
de Montjoye, Y.A., Radaelli, L., Singh, V. & Pentland, A. (2015). Unique in the shopping mall: on the reidentifiability of credit card metadata. Science, 347(6221), 536-539.
Determann, L. (2017). Datenschutz: International Compliance Field Guide. München: Beck.
Dommering, E. (2006). Regulating Technology: Code is not Law. In E. Dommering & L. Asscher (Eds.), Coding Regulation—Essays on the Normative Role of Information Technology (pp. 1-16). The Hague: T.M.C. Asser Press.
Egelman, S., Bernd, J., Friedland, G. & Garcia, D. (2016, March). The Teaching Privacy Curriculum. Proceedings of the 47th ACM Technical Symposium on Computing Science Education, Memphis, Tennessee, USA, 591-596. <https://doi.org/10.1145/2839509.2844619>
Fairchild, A. & Ribbers, P. (2011). Privacy-Enhancing Identity Management in Business. In J. Camenisch, R. Leenes & D. Sommer (Eds.), Digital Privacy (pp. 107-129). Heidelberg: Springer.
Feigenbaum, J. & Ford, B. (2015). Seeking Anonymity in an Internet Panopticon. Communications of the ACM, 58(1), 58-69.
Fhom, H.S. & Bayarou, K. (2011). Towards a Holistic Privacy Engineering Approach for Smart Grid Systems. Proceedings of International Joint Conference of IEEE TrustCom11, 234-241. <https://doi.org/10.1109/TrustCom.2011.32>
Finneran Dennedy, M., Fox, J. & Finneran, T. (2014). The Privacy Engineer’s Manifesto: Getting from Policy to Code to QA to Value [Electronic book]. McAfee Apress. <https://doi.org/10.1007/978-1-4302-6356-2>
Goldberg, I. (2008). Privacy-Enhancing Technologies for the Internet III: Ten Years Later. In A. Acquisti, S. Gritzalis, C. Lambrinoudakis & S. di Vimercati (Eds.), Digital Privacy—Theory, Technologies, and Practices (pp. 3-18). New York: Auerbach Publications.
Gürses, S.F. & del Alamo, J. (2016). Privacy Engineering: Shaping an Emerging Field of Research and Practice. IEEE Symposium on Security and Privacy, April/May Issue, 40-46.
Gürses, S.F., Troncoso, C. & Diaz, C. (2011). Engineering Privacy by Design. Computers, Privacy & Data Protection, unpaginated. Retrieved from <https://www.esat.kuleuven.be/cosic/publications/article-1542.pdf>
Hoepman, J.H. (2014). Privacy Design Strategies. Conference paper IFIP International Information Security Conference. Published in ICT Systems Security and Privacy Protection, 428, 446-459. <https://doi.org/10.1007/978-3-642-55415-5_38>
Hong, J., Ng, J.D., Lederer, S. & Landay, J. (2004). Privacy risk models for designing privacy-sensitive ubiquitous computing systems. Proceedings of the ACM Conference on Designing Interactive Systems, Boston, MA, USA, 91-100.
Hoofnagle, C., King, J., Li, S. & Turow, J. (2010). How Different are Young Adults From Older Adults When it Comes to Information Privacy Attitudes & Policies? Working Paper of the Annenberg School of Communication, University of Pennsylvania. Retrieved from <http://repository.upenn.edu/asc_papers/399>
Iachello, G. & Hong, J. (2007). End-User Privacy in Human-Computer Interaction. Foundations and Trends in Human-Computer Interaction, 1(1), 1-137.
Junco, R. (2015). What Are Digital Literacies and Why Do They Matter? In S. Cortesi & U. Gasser, Digitally Connected—Global Perspectives on Youth and Digital Media (pp. 45-47), Research Publication No. 2015-6.
Kalloniatis, C., Kavakli, E. & Gritzalis, S. (2009). Methods for Designing Privacy Aware Information Systems: A review. Paper presented at the 13th Panhellenic Conference on Informatics, IEEE, 185-194. <https://doi.org/10.1109/PCI.2009.45>
Klitou, D. (2014). Privacy-Invading Technologies and Privacy by Design—Safeguarding Privacy, Liberty and Security in the 21st Century. Heidelberg: Springer.
Koops, B.J. & Leenes, R. (2014). Privacy Regulation Cannot Be Hardcoded: A critical comment on the ‘privacy by design’ provision in data-protection law. International Review of Law, Computers & Technology, 28(2), 159-171.
Kosinski, M., Stillwell, D. & Graepel, T. (2013). Private traits and attributes are predictable from digital records of human behavior. Proceedings of the National Academy of Science, 110(15). <https://doi.org/10.1073/pnas.1218772110>
Landau, S. (2014). Educating Engineers: Teaching Privacy in a World of Open Doors. IEEE Security & Privacy, 12(3), 66-70.
Langheinrich, M. (2005). Personal Privacy in Ubiquitous Computing—Tools and System Support. Dissertation, ETH Zurich, No. 16100.
Lindhorst, A. (2002). Das Einsteigerseminar: Sichere E-Mails mit PGP. Verlag Moderne Industrie.
Mollin, R. (2007). An Introduction to Cryptography (2nd edition). London: CRC Press.
Narayanan, A. & Shmatikov, V. (2008). Robust De-anonymization of Large Sparse Datasets. Proceeding of the 2008 IEEE Symposium on Security and Privacy, Washington DC, USA, 111-125. <https://doi.org/10.1109/SP.2008.33>
Oetzel, M.C. & Spiekermann, S. (2014). A systematic methodology for privacy impact assessments: a design science approach. European Journal of Information Systems, 23, 126-150.
Ohm, P. (2010). Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization. University of California Los Angeles Law Review, 57, 1701-1777.
Palfrey, J. & Gasser, U. (2012). Interop—The Promise and Perils of Highly Interconnected Systems. New York: Basic Books. (cited: Interop)
———— (2008). Born Digital—How Children Grow Up in a Digital Age. Revised and expanded version. New York: Basic Books. (cited: Born Digital)
Rescorla, E. (2001). SSL and TLS: Designing and Building Secure Systems. Boston: Addison-Wesley.
Rubinstein, I. & Hartzog, W. (2016). Anonymization and Risk. Washington Law Review, 91(2), 703-760.
Schwartz, P. & Solove, D. (2011). The PII Problem: Privacy and a New Concept of Personally Identifiable Information. New York University Law Review, 86, 1815-1894. (cited: PII)
Spiekermann, S. & Cranor, L.F. (2009). Engineering Privacy, IEEE Transactions on Software Engineering, 35(1), 67-82.
Spiekermann, S. & Novotny, A. (2015). A vision for global privacy bridges: Technical and legal measures for international data markets. Computer Law & Security Review, 31, 181-200.
Sweeney, L. (2000). Simple Demographics Often Identify People Uniquely. Carnegie Mellon University, Data Privacy Working Paper 3. Retrieved from <http://dataprivacylab.org/projects/identifiability/paper1.pdf> (cited: Demographics)
Sweeney, L., Abu, A. & Winn, J. (2013). Identifying Participants in the Personal Genome Project by Name. Harvard University, Data Privacy Lab, White Paper 1021-1. Retrieved from <http://dataprivacylab.org/projects/pgp/>
Thierer, A. (2014). Permissionless Innovation, The Continuing Case for Comprehensive Technological Freedom. Arlington: Mercatus Center. (cited: Innovation)
von Ahn, L., Bortz, A., Hopper, N. & O’Neill, K. (2006). Selectively Traceable Anonymity. In G. Danezis & P. Golle (Eds.), Privacy Enhancing Technologies (pp. 208-222). Heidelberg: Springer.
Wang, Y. & Kobsa, A. (2008). Privacy Enhancing Technologies. In M. Gupta (Ed.), Handbook of Research on Emerging Developments in Data Privacy (pp. 352-375). ICI Global.
Whitten, A. & Tygar, D. (1999). Why Johnny Can’t Encrypt. A Usability Evaluation of PGP 5.0. In L.F. Cranor & S. Garfinkel (Eds.), Security and Usability—Designing Secure Systems That People Can Use (pp. 679-702). Sebastopol: O’Reilly.
Wood, A., O’Brien, D. & Gasser, U. (2016). Privacy and Open Data [Electronic version]. Networked Policy Series, Berkman Klein Center Research Publication No. 2016-16. Retrieved from <https://cyber.harvard.edu/publications/2016/OpenDataBriefing>
Wu, C.H. & Irwin, D. (2013). Introduction to Computer Networks and Cybersecurity. London: CRC Press.
Yi, X., Paulet, R. & Bertino, E. (2014). Homomorphic Encryption and Applications. SpringerBriefs in Computer Science. Heidelberg: Springer.
Zimmermann, P. (1999). Phil Zimmermann on PGP [Electronic version]. In Introduction to Cryptography (pp. 37-62). Network Associates. (cited: PGP) Retrieved from <ftp://ftp.pgpi.org/pub/pgp/7.0/docs/english/IntroToCrypto.pdf>
News(paper) Articles and Blogs
Lee, M. (2014, October 15). POODLE and The Curse of Backwards Compatibility. Cisco Blog. Retrieved from <http://blogs.cisco.com/security/talos/poodle-and-the-curse-of-backwards-compatibility> (cited: Lee, Cisco Blog, Backwards Compatibility, 2014)
Greenberg, A. (2014, November 18). WhatsApp Just Switched on End-to-End Encryption for Hundreds of Millions of Users. Wired. Retrieved from <https://www.wired.com/2014/11/whatsapp-encrypted-messaging/> (cited: Greenberg, Wired, 2014)
© 2018 Springer Nature Switzerland AG
Cite this chapter
Tamò-Larrieux, A. (2018). Strengthening Privacy by Design. In: Designing for Privacy and its Legal Framework. Law, Governance and Technology Series, vol. 40. Springer, Cham. https://doi.org/10.1007/978-3-319-98624-1_10
Print ISBN: 978-3-319-98623-4
Online ISBN: 978-3-319-98624-1
eBook Packages: Law and Criminology (R0)