Selling your soul while negotiating the conditions: from notice and consent to data control by design

Abstract

This article argues that the Notice and Consent (N&C) approach is not an efficient means of protecting the privacy of personal data. On the contrary, N&C may be seen as a licence to freely exploit an individual’s personal data. For this reason, legislators and regulators around the world have been advocating different and more efficient safeguards, notably through the implementation of the Privacy by Design (PbD) concept, which is predicated on the assumption that privacy cannot be assured solely by compliance with regulatory frameworks. In this sense, PbD affirms that privacy should become a key concern for developers and organisations alike, thus permeating new products and services as well as organisational modi operandi. In this paper, we aim at uncovering evidence of the inefficiency of the N&C approach, as well as possibilities to further enhance PbD in order to provide the individual with increased control over her personal data. The paper aims at shifting the focus of the discussion from “take it or leave it” contracts to concrete solutions aimed at empowering individuals. As such, we put forth the Data Control by Design (DCD) concept, which we see as an essential complement to the N&C and PbD approaches advocated by data-protection regulators. The technical mechanisms that would enable DCD are already available (for example, the User-Managed Access (UMA) v1.0.1 Core Protocol). We therefore argue that data-protection frameworks should foster the adoption of DCD mechanisms in conjunction with PbD approaches, and that privacy protections should be designed so that every individual can utilise interoperable DCD tools to efficiently manage the privacy of her personal data. After scrutinising the N&C, PbD and DCD approaches, we discuss the specificities of health and genetic data and the role of DCD in this context, stressing that the sensitivity of genetic and health data requires special scrutiny from regulators and developers alike.
In conclusion, we argue that concrete solutions allowing for DCD already exist and that policy makers should join efforts with other stakeholders to foster their concrete adoption.
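The DCD idea sketched above can be illustrated with a toy model. The sketch below is purely illustrative and only loosely inspired by the UMA pattern, in which the data subject registers access policies with an authorization server that requesting parties must satisfy; all class and method names are hypothetical and do not reproduce the UMA API.

```python
# Toy model of Data Control by Design (DCD): the data subject, not the
# service provider, registers per-resource, per-purpose, per-party policies
# with an authorization server. Hypothetical names; not the UMA API.

from dataclasses import dataclass, field

@dataclass
class Policy:
    resource: str            # e.g. "genetic_profile"
    allowed_purposes: set    # purposes the data subject consents to
    allowed_parties: set     # requesting parties the subject trusts

@dataclass
class AuthorizationServer:
    policies: dict = field(default_factory=dict)

    def register_policy(self, policy: Policy) -> None:
        # Only the data subject can set or change the policy.
        self.policies[policy.resource] = policy

    def authorize(self, resource: str, party: str, purpose: str) -> bool:
        policy = self.policies.get(resource)
        if policy is None:
            return False  # default-deny: no explicit policy, no access
        return (party in policy.allowed_parties
                and purpose in policy.allowed_purposes)

az = AuthorizationServer()
az.register_policy(Policy("genetic_profile", {"diagnosis"}, {"my_doctor"}))

print(az.authorize("genetic_profile", "my_doctor", "diagnosis"))   # True
print(az.authorize("genetic_profile", "ad_network", "marketing"))  # False
```

The design choice to illustrate is default-deny with subject-set policies: instead of a one-off blanket consent, access is evaluated against the individual’s current policy at every request.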

Notes

  1.

    According to EU legislation, “personal data” means “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.” Furthermore, “data concerning health” is defined as “personal data related to the physical or mental health of a natural person, including the provision of health care services, which reveal information about his or her health status,” while “genetic data” means “personal data relating to the inherited or acquired genetic characteristics of a natural person which give unique information about the physiology or the health of that natural person and which result, in particular, from an analysis of a biological sample from the natural person in question.” See Art. 4(1), (15) and (13) of Regulation (EU) 2016/679, better known as the General Data Protection Regulation. Due to its comprehensive nature, the EU approach is usually considered a data-protection benchmark. The U.S. approach, for instance, has been criticised as less coherent and consistent, offering multiple competing definitions of personal information. See e.g. Schwartz and Solove [50].

  2.

    In the 1970s, the growing use of automated systems for collecting and processing data about individuals gave birth to the first national privacy frameworks – e.g. the 1974 US Fair Information Practice Principles or the French 1978 Loi Informatique et Libertés (Law n°78-17) – and stimulated the first international frameworks on data protection. The OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data were adopted by the Council of the OECD and became applicable on 23 September 1980. In January 1981, the Council of Europe adopted the Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data.

  3.

    See OECD [38].

  4.

    See http://publications.apec.org/publication-detail.php?pub_id=390

  5.

    See Bundesverfassungsgericht, Decision of 15 December 1983 (1 BvR 209, 269, 362, 420, 440, 484/83), BVerfGE 65, 1–71.

  6.

    See the keynote delivered by Joseph Cannataci at the Health Privacy Summit 2016: https://www.youtube.com/watch?v=XuBWs3PBDMk

  7.

    Notably, Article 8 of the Council of Europe Convention 108 and Article 12 of the EU Data Protection Directive 95/46/EC ascribe to data subjects the rights to access their personal data; to have their data deleted or blocked; and to object to the use of their data for direct-marketing purposes, to automated decision-making, or to processing that produces disproportionate results. The updated OECD Privacy Guidelines as well as the new EU General Data Protection Regulation further clarify that individuals also enjoy the right to have their data erased, rectified, completed or amended [37].

  8.

    See Blank et al. [4].

  9.

    See e.g. Obar and Oeldorf-Hirsch [34]. See also http://www.biggestlie.com/

  10.

    Such criticalities have been well known to policymakers since the mid-2000s. For instance, based on empirical research [58], Federal Trade Commissioner Jon Leibowitz famously stated: “Initially, privacy policies seemed like a good idea. But in practice, they often leave a lot to be desired. In many cases, consumers don’t notice, read, or understand the privacy policies. They are often posted inconspicuously via a link at the very bottom of the site’s homepage – and filled with fine-print legalese and techno talk.”

  11.

    The situation does not look rosier in countries traditionally considered to have a highly educated population, like Japan, the Netherlands or Finland, where the percentage of illiteracy within the adult population is close to 40%, or in countries considered highly developed, such as the U.S., France or Germany, where the percentage of illiteracy exceeds 50% of the adult population. See OECD [36].

  12.

    See e.g. [1, 6]. For a reading list on data-driven economy literature, see https://www.uschamberfoundation.org/reading-list-data-driven-economy

  13.

    Although not all providers make this element explicit in their PPs or ToS, some services such as the Pokémon GO application openly state that “Information that we collect from our users, including [personal data], is considered to be a business asset. Thus, if we are acquired by a third party as a result of a transaction such as a merger, acquisition, or asset sale or if our assets are acquired by a third party in the event we go out of business or enter bankruptcy, some or all of our assets, including your (or your authorized child’s) [personal data], may be disclosed or transferred to a third party acquirer in connection with the transaction.” See https://www.nianticlabs.com/privacy/pokemongo/en

  14.

    Cavoukian and Prosch [8] also highlight that privacy can be redesigned using a transformative “Privacy by ReDesign” process which offers a framework for undertaking a proactive assessment of existing gaps in how an organization manages personal information and addresses those gaps systematically.

  15.

    See e.g. Ziegeldorf et al. [60]; KPMG [25].

  16.

    See 32nd International Conference of Data Protection and Privacy Commissioners, Jerusalem, Israel, 27–29 October 2010, Resolution on Privacy by Design. https://secure.edps.europa.eu/EDPSWEB/webdav/site/mySite/shared/Documents/Cooperation/Conference_int/10-10-27_Jerusalem_Resolutionon_PrivacybyDesign_EN.pdf

  17.

    For a discussion of the concept of legal interoperability and its applications, see Santosuosso and Malerba [46]; Weber [59]; Belli and Foditsch [3].

  18.

    Data minimisation posits that only the minimal data necessary to perform a given task should be collected, processed and disclosed, in order to reduce the chances that personal data are misused or leaked.
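Data minimisation can be sketched concretely: before a record leaves the data subject’s control, every field not needed for the stated task is stripped. The field names and the task registry below are hypothetical examples, not a real schema.

```python
# Illustrative data-minimisation sketch: a per-task whitelist of fields.
# Field names and tasks are hypothetical examples.

REQUIRED_FIELDS = {
    "appointment_booking": {"name", "contact_email"},
    "anonymous_statistics": {"year_of_birth"},
}

def minimise(record: dict, task: str) -> dict:
    """Return only the fields the stated task actually needs (default: none)."""
    needed = REQUIRED_FIELDS.get(task, set())
    return {k: v for k, v in record.items() if k in needed}

patient = {
    "name": "A. Subject",
    "contact_email": "a@example.org",
    "year_of_birth": 1980,
    "genetic_profile": "...",  # sensitive; never needed for booking
}

print(minimise(patient, "appointment_booking"))
# {'name': 'A. Subject', 'contact_email': 'a@example.org'}
```

The point of the whitelist (as opposed to a blacklist of sensitive fields) is that any field not explicitly justified by the task is dropped by default, which mirrors the minimisation principle described above.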

  19.

    In addition to the projects discussed in further detail later in this paper, some cursory examples of initiatives intended to shift the personal data management paradigm include: 1) the QIY Foundation, a consortium of private and public organisations based in the Netherlands, which has developed a technology protocol and scheme of principles designed to help members of the consortium cooperate in a way that gives people who use their services control over their data. 2) The midata vision, published under the United Kingdom’s 2010–2015 coalition government to announce a voluntary partnership between 26 public and private organisations, created for the purpose of giving individuals access to their personal data on request, in machine-readable format, and in a safe way [28, 51]. 3) The Meeco digital service, built by a private company based in Australia to help individuals add, organise, edit, and share their personal information on one secure platform. 4) The Midata.coop initiative, led by researchers experimenting with the creation of regional cooperatives that allow people to store, manage, and control access to their health-related personal data through a combination of open-source software and government regulation [18]. 5) The Customer Commons project, which aims to develop a suite of legal terms and visual icons allowing individuals to set the terms under which their data can be shared and used by second and third parties, on the model of how Creative Commons terms work for copyright law [48, 49]. 6) Datamixers, an online start-up company based in Belgium, which provides a platform for customers to access their personal data from different sources.

  20.

    The term “data sovereignty” has been employed to mean different things. Katryna Dow, the CEO and Founder of the personal data management start-up Meeco, has publicly adopted the term as a tagline for the mission of the company, defining it as the concept that an individual’s personal data and information should belong to them. See https://meeco.me/why-meeco.html

  21.

    Matt Hogan, co-founder and CEO of Datacoup, describes Datacoup as a personal data marketplace where users can aggregate and sell their data. See PSFK Labs [42].

  22.

    Meeco is a service intended to help users organise, edit, share, encrypt, and search their personal information across devices. See https://meeco.me/

  23.

    Founded with angel funding from Tim Draper and Marc Benioff, DataWallet is a free application that launched with a closed beta version in June 2016.

  24.

    For example, NemID, which provides a centralised single interface for e-government services in Denmark (including healthcare); HealthBank, a private company whose service consolidates healthcare data management for Swiss citizens; and eIDAS, a web protocol designed to provide interoperable identity services, all take different approaches to allowing people to access a dynamic, up-to-date, machine-readable version of their health data, rather than requesting particular records from past healthcare providers every time they use a new healthcare provider.

  25.

    23andMe is a privately owned, direct-to-consumer commercial online genetic testing service, and Promethease is a program that reanalyses genetic testing results from companies like 23andMe based on public genetic data [44].

  26.

    For example, HealthTap is a mobile health start-up that allows people to remotely consult an interactive community of physicians; WebMD is a service that provides access to medical information and research through one online access portal; the Health Figures JavaScript library generates graphical representations of health data [26]; and OneDrop offers a mobile diabetes management tool.

  27.

    Programmes like Castlight Health, the Chevron Wellness Program, Apple HealthKit, CipherHealth, and the John Hancock Vitality program all utilise automated data-collection methods, like smartphone-based step trackers, sleep-tracking applications, and food diaries, to incentivise people to track and quantify their everyday activities for the purpose of altering behaviour in a way that optimises their personal health. The incentive mechanisms of these services range from the simple appeal of gamification to monetary rewards.

  28.

    Digital Health Revolution is a partnership between universities and research centres across Finland that are actively researching and prototyping new healthcare data systems and services that operate in accordance with the MyData principles of data security, interoperability, and usability.

  29.

    In an American Medical Informatics Association white paper advocating the creation of a national framework for the secondary use of health data in the United States, Safran et al. explain that, while there would be many benefits to the secondary use of health data, “secondary use of health data poses technical, strategic, policy, process, and economic concerns related to the ability to collect, store, aggregate, link, and transmit health data broadly and repeatedly for legitimate purposes. Thus, lack of coherent policies and standard ‘good practices’ for secondary use of health data impedes efforts to transform the U.S. health care system.” [45]

  30.

    See https://www.nih.gov/precision-medicine-initiative-cohort-program

  31.

    See https://www.genomicsengland.co.uk/

  32.

    In Brazil, for instance, Google queries for health-related information are the second most popular searches. See http://cetic.br/publicacao/pesquisa-sobre-o-uso-das-tecnologias-de-informacao-e-comunicacao-nosdomicilios-brasileiros/

  33.

    The Direct-to-Consumer tests are available on https://www.23andme.com/

  34.

    In this context, it becomes useful to utilise the concept of “total control” as developed by Michel Foucault in “Discipline and Punish”.

  35.

    The practice of individual profiling, i.e. collecting and automatically processing data to build hypotheses regarding personality and interests, is of great importance to businesses, as personalised advertising delivered at an opportune moment is highly successful in winning new customers [27]. Notably, the “dataman” Jeffrey Hammerbacher, developer of the software Cloudera, stated that, in his search for “following the smartest people to find the best problem,” healthcare is “the best problem by far.” See http://bits.blogs.nytimes.com/2013/06/19/sizing-up-big-data-broadening-beyond-the-internet/?_r=0

  36.

    https://search.coe.int/cm/pages/result_details.aspx?objectid=09000016806b2c5f.

  37.

    Direct-to-consumer genetic testing refers to genetic tests that are marketed directly to consumers via television, print advertisements, or the Internet.

  38.

    Between 2013 and 2015, the US Food and Drug Administration (FDA) barred 23andMe from marketing its personal genome service (PGS), concerned about the potential consequences of customers receiving inaccurate health results. See http://www.nytimes.com/2015/10/21/business/23andme-will-resume-giving-users-health-data.html?_r=0.

  39.

    In this sense, GeneWatch Executive Director Helen Wallace has stressed that genetic tests “are not regulated and the science is still poorly understood - so there is a real danger people could be misled about their health,” and that her “main concern is that the human genome is set to become a massive marketing scam.” See https://www.theguardian.com/science/2008/jan/22/genetics.health

  40.

    In this hypothesis, a person who has a genetic mutation which presents a threat and who is strongly advised by doctors not to have children would be unable to experience parenthood.

  41.

    See http://www.wired.com/2016/02/schools-kicked-boy-based-dna/.

  42.

    High-impact information is information that reveals a high propensity to certain serious illnesses, such as the presence of a BRCA mutation, which is correlated with breast cancer instances, or the genetic mutation that points to the future onset of Huntington’s disease.

  43.

    This is particularly sensitive in countries where the health system holds employers responsible for providing health assistance to their employees.

  44.

    The guarantee of the right not to know does not resolve the complexity of the question: even if a person states that he or she does not want to know, in cases where a possible genetic mutation has to be, or probably will be, shared with blood relatives, those relatives may demand the right to be informed, particularly in cases where treatment is available. For this reason, familial consent is discussed in some contexts. “However, familial versions of informed consent could not be instituted without obstructing individuals who for medical or other reasons seek information about their own genetic status, yet lack familial consent to do so” [39].

References

  1.

    Acquisti A. (2010). The Economics of Personal Data and the Economics of Privacy. Joint WPISP-WPIE Roundtable. Background Paper #3. OECD Conference Centre. https://www.oecd.org/sti/ieconomy/46968784.pdf.

  2.

    Belli, L. & Venturini, J. (2016). Private ordering and the rise of terms of service as cyber-regulation. Internet Policy Review, 5(4). https://policyreview.info/node/441/pdf.

  3.

    Belli, L. and Foditsch, N. (2016) “Network Neutrality: An Empirical Approach to Legal Interoperability”, in Belli, L. and De Filippi, P. (Eds.) Net neutrality compendium: human rights, free competition and the future of the internet. Springer.

  4.

    Blank G, Bolsover G, Dubois E. New privacy paradox: young people and privacy on social network sites. Global Cyber Security Capacity Centre: Draft Working Paper; 2014.

  5.

    Cannataci, J. (2016). Report of the Special Rapporteur on the right to privacy, Joseph A. Cannataci. A/HRC/31/64.

  6.

    Cattaneo G. et al. (2015). European Data Market SMART 2013/0063. D6 — First Interim Report. https://idc-emea.app.box.com/s/k7xv0u3gl6xfvq1rl667xqmw69pzk790.

  7.

    Cavoukian A. Privacy by design: the 7 foundational principles. Ontario: Office of the Information and Privacy Commissioner; 2009. Retrieved May 30, 2016 from https://www.ipc.on.ca/images/Resources/7foundationalprinciples.pdf

  8.

    Cavoukian A. and Prosch, M. (2011). Privacy by ReDesign: Building a Better Legacy. http://privacybydesign.ca/content/uploads/2010/11/PbRD.pdf.

  9.

    Cohen JE. Configuring the networked self: law, code, and the play of everyday practice. New Haven, CT: Yale University Press; 2012.

  10.

    Conner-Simons A. Web Inventor Tim Berners-Lee’s Next Project: A Platform that gives users control of their data. 2015; In MIT CSAIL. http://www.csail.mit.edu/solid_mastercard_gift

  11.

    Cooper A, et al. Privacy considerations for internet protocols. RFC 6973; 2013. https://tools.ietf.org/html/rfc6973#ref-PbD

  12.

    ENISA (2014). Privacy and data protection by design. From policy to engineering. https://www.enisa.europa.eu/publications/privacy-and-data-protection-by-design.

  13.

    European Commission (2015). Special Eurobarometer 431 “Data Protection.” http://ec.europa.eu/public_opinion/archives/ebs/ebs_431_en.pdf.

  14.

    Funk, C., Kennedy, B., & Podrebarac Sciupac, E. (2016). U.S. Public Wary of Biomedical Technologies to ‘Enhance’ Human Abilities. Pew Research Center. http://www.pewinternet.org/2016/07/26/u-s-public-wary-of-biomedical-technologies-to-enhance-human-abilities/.

  15.

    Geller, L. et al. Individual, family, and societal dimensions of genetic discrimination: a case study analysis. In: Alper, J. et al. (Eds.). The double-edged helix: social implications of genetics in a diverse society. Baltimore: The Johns Hopkins University Press, 2002. p. 247-266.

  16.

    Gjorgievska, A. (2016). Google and Facebook lead digital ad industry to revenue record. Bloomberg Technology. https://www.bloomberg.com/news/articles/2016-04-22/google-and-facebook-lead-digital-ad-industry-to-revenue-record.

  17.

    Guedes, Cristiano & Diniz, D. (2007). Um caso de discriminação genética: o traço falciforme no Brasil. PHYSIS: Rev. Saúde Coletiva, Rio de Janeiro, 17(3):501-520, 2007 Available at http://www.scielo.br/pdf/physis/v17n3/v17n3a06.pdf.

  18.

    Hafen E, Kossmann D, Brand A. Health data cooperatives - citizen empowerment. Methods Inf Med. 2014;53(2):82–6. doi:10.3414/ME13-02-0051.

  19.

    Hanen, Marsha. (2009). Genetic Technologies and Medicine: Privacy, Identity, and Informed Consent. Lessons from the identity trial: Anonymity, Privacy and Identity in a Networked Society. Available on http://idtrail.org/content/view/799.html.

  20.

    Hull G. (2015). Successful failure: what Foucault can teach us about privacy self-management in a world of Facebook and big data. Ethics and Information Technology 17(2). doi:10.1007/s10676-015-9363-z.

  21.

    Jerome, J. (2013). Buying and Selling Privacy: Big Data's Different Burdens and Benefits Available on http://papers.ssrn.com/sol3/cf_dev/AbsByAuth.cfm?per_id=1513383.

  22.

    Junges, José Roque; Recktenwald, Micheli; Hebert, Noéli Daiãm Raymundo; Moretti, Andressa Wagner; Pereira, Bárbara Nicole Karlinski. (2015). Sigilo e privacidade das informações sobre usuário nas equipes de atenção básica à saúde: revisão. Revista Bioética, 23(1). Available on http://revistabioetica.cfm.org.br/index.php/revista_bioetica/article/view/1000.

  23.

    Kellogg, B. (2016). DataWallet Launches to Empower Consumers to Claim the Profits Made with Their Data. prweb. http://www.prweb.com/releases/2016/06/prweb13479668.htm.

  24.

    Kerr I. et al. (2009). Soft Surveillance, Hard Consent: The Law and Psychology of Engineering Consent. Lessons from the identity trial: Anonymity, Privacy and Identity in a Networked Society. Available on http://idtrail.org/content/view/799.html.

  25.

    KPMG. The internet of things: should We embrace its full potential? Cyber Insights Magazine: Edition. 2015;3 https://assets.kpmg.com/content/dam/kpmg/pdf/2016/04/ch-the-internet-of-things-en.pdf

  26.

    Ledesma A, Al-Musawi M, Nieminen H. Health figures: an open source JavaScript library for health data visualization. BMC Medical Informatics and Decision Making. 2016; doi:10.1186/s12911-016-0275-6.

  27.

    Louzada, L. (2015). Bancos de Perfis Genéticos para fins de investigação criminal: reflexões sobre a regulamentação no Brasil. Dissertação de Mestrado. Programa de Pós-Graduação em Ciências Sociais e Jurídicas da Universidade Federal Fluminense (PPGSD/UFF).

  28.

    The midata vision of consumer empowerment. From the Department for Business, Innovation & Skills and The Rt Hon Edward Davey. 2011; https://www.gov.uk/government/news/the-midata-vision-of-consumer-empowerment

  29.

    Machulak, M. P., Maler, E. L., Catalano, D., & Van Moorsel, A. (2010). User-managed access to web resources. In Proceedings of the 6th ACM workshop on Digital identity management (pp. 35–44). ACM.

  30.

    Mantovani E, Quinn P, Guihen B, Habbig A, De Hert P. eHealth to mHealth – a journey precariously dependent upon apps? European Journal of ePractice. 2013;20:48–66. http://www.vub.ac.be/LSTS/pub/Dehert/461.pdf

  31.

    McDonald A.M. and Cranor L.F. (2008). The Cost of Reading Privacy Policies. In I/S: A Journal of Law and Policy for the Information Society. 2008 Privacy Year in Review issue.

  32.

    Mitchell A. From data hoarding to data sharing. Journal of Direct, Data and Digital Marketing Practice. 2012;13(4):325–34. doi:10.1057/dddmp.2012.3.

  33.

    Nebert D, Bingham E. Pharmacogenomics: out of the lab and into the community. Trends Biotechnol. 2001;19(12)

  34.

    Obar J. A. and Oeldorf-Hirsch A. (2016). The Biggest Lie on the Internet: Ignoring the Privacy Policies and Terms of Service Policies of Social Networking Services http://ssrn.com/abstract=2757465.

  35.

    OECD. (2013a). Exploring the Economics of Personal Data: A Survey of Methodologies for Measuring Monetary Value. OECD Digital Economy Papers, No. 220. OECD Publishing. Paris. doi:10.1787/5k486qtxldmq-en.

  36.

    OECD. OECD skills outlook 2013: first results from the survey of adult skills. OECD Publishing. 2013b; doi:10.1787/9789264204256-en.

  37.

    OECD. Recommendation of the Council concerning Guidelines governing the Protection of Privacy and Transborder Flows of Personal Data. 2013c; http://www.oecd.org/sti/ieconomy/oecd_privacy_framework.pdf

  38.

    OECD. Recommendation of the Council concerning Guidelines governing the Protection of Privacy and Transborder Flows of Personal Data. 1980; http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm

  39.

    O’Neill O. Informed consent and genetic information. Studies in History and Philosophy of Biological and Biomedical Sciences. 2001;32(4)

  40.

    Poikola, A., Kuikkaniemi, K. and Honko, H. (2015). “MyData: A Nordic Model for human-centered personal data management and processing.” Finnish Ministry of Transport and Communications. http://www.lvm.fi/documents/20181/859937/MyData-nordic-model/2e9b4eb0-68d7-463b-9460-821493449a63?version=1.0.

  41.

    Rainie, L. (2016). The state of privacy in America: what we learned. Pew Research Center. http://www.pewresearch.org/fact-tank/2016/01/20/the-state-of-privacy-in-america/.

  42.

    PSFK Labs. Creating a Transparent Marketplace for Personal Data. 2015; http://www.psfk.com/2015/08/personal-data-datacoup-personal-information-marketplace-matt-hogan.html

  43.

    Ramirez, A., Brill, J., Ohlhausen, M., Wright, J. and McSweeney, T. (2014). Data brokers: a call for transparency and accountability. Federal Trade Commission. https://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf.

  44.

    Ruckenstein M. Keeping data alive: talking DTC genetic testing. Information, Communication & Society. 2016:1–16. doi:10.1080/1369118X.2016.1203975.

  45.

    Safran, C., Bloomrosen, M., Hammond, W.E., Labkoff, S., Markel-Fox, S., Tang, P., & Detmer, D. (2007). Toward a National Framework for the secondary use of health data: an American medical informatics association white paper.

  46.

    Santosuosso A. and Malerba A. (2014). Legal interoperability as a comprehensive concept in transnational law. Law, Innovation and Technology 6(1) http://www.tandfonline.com/doi/abs/10.5235/17579961.6.1.51.

  47.

    Searls D. The intention economy: when customers take charge. Cambridge: Harvard Business Review Press; 2012.

  48.

    Searls, D. (2016a). Time for THEM to agree to OUR terms. Customer Commons Blog. http://customercommons.org/blog/.

  49.

    Searls, D. (2016b). At last, a protocol to connect VRM and CRM. ProjectVRM Blog. http://blogs.harvard.edu/vrm/2016/08/18/at-last-a-protocol-to-connect-vrm-and-crm/.

  50.

    Schwartz P.M. & Solove D.J. (2011). The PII Problem: Privacy and a New Concept of Personally Identifiable Information. N.Y.U. L. Rev. 86.

  51.

    Shadbolt N. Midata: towards a personal information revolution. Digital Enlightenment Yearbook. 2013:202–24.

  52.

    “User-Managed Access (UMA) Profile of OAuth 2.0”. Retrieved on 30 September 2016 from https://docs.kantarainitiative.org/uma/rec-uma-core.html.

  53.

    Vaidhyanathan, S. (2015). The rise of the Cryptopticon. The Hedgehog Review 17(1). http://www.iasc-culture.org/THR/THR_article_2015_Spring_Vaidhyanathan.php.

  54.

    Van Blarkom G.W., Borking J.J. and Olk J.G.E. (2003). Handbook of privacy and privacy-enhancing technologies the case of intelligent software agents. PISA Consortium.

  55.

    Van Rossum H, et al. Privacy-enhancing technologies: the path to anonymity. In: Registratiekamer, the Netherlands, and information and privacy commissioner. Ontario: Canada; 1995.

  56.

    Venturini, J. et al. (2016). Terms of service and human rights: an analysis of online platform contractual agreements. Revan Editor. http://internet-governance.fgv.br/sites/internet-governance.fgv.br/files/publicacoes/tos_0.pdf.

  57.

    World Economic Forum. Personal Data: The Emergence of a New Asset Class. 2011; http://www3.weforum.org/docs/WEF_ITTC_PersonalDataNewAsset_Report_2011.pdf

  58.

    Williams F. (2006). Internet privacy policies: a composite index for measuring compliance to the fair information principles.

  59.

    Weber, R. (2014). Legal interoperability as a tool for combatting fragmentation. Global Commission on Internet Governance, Paper Series n°4. https://www.cigionline.org/sites/default/files/gcig_paper_no4.pdf.

  60.

    Ziegeldorf JH, Garcia Morchon O, Wehrle K. Privacy in the internet of things: threats and challenges. Security and Communication Networks. 2014;7:12. doi:10.1002/sec.795.

Author information

Corresponding author

Correspondence to Luca Belli.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Funding

There is no funding source.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

This article is part of the Topical Collection on Privacy and Security of Medical Information

About this article

Cite this article

Belli, L., Schwartz, M. & Louzada, L. Selling your soul while negotiating the conditions: from notice and consent to data control by design. Health Technol. 7, 453–467 (2017). https://doi.org/10.1007/s12553-017-0185-3

Keywords

  • Notice and consent
  • Privacy by design
  • Data control by design
  • Data protection
  • Health data