Abstract
European Union data protection law aims to protect individuals from privacy intrusions through a myriad of procedural tools that reflect a code of fair information principles. These tools empower individuals by giving them rights to control the processing of their data. One problem with the European Union’s current reliance on fair information principles, however, is that these tools are increasingly challenged by technological reality. Perhaps nowhere is this more evident than when it comes to data mining, which puts the European Union’s data protection principles to the ultimate test. As early as 1998, commentators noted that there is a rather paradoxical relation between data mining and some data protection principles. This paper explores this so-called paradoxical relationship further and specifically examines how data mining calls into question the purpose limitation principle. Particular attention is paid to how data mining defies this principle in a way that the data analysis tools of the recent past do not.
Notes
- 1.
Westin 1967.
- 2.
- 3.
Article 6(1)(b) of EU Data Protection Directive 95/46/EC; Article 5(b) CoE Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data; Article 3 of EU Framework Decision 2008/977/JHA.
- 4.
- 5.
Westin 1967.
- 6.
Westin 1967.
- 7.
Mulligan and King 2012.
- 8.
Westin 1967, p. 33.
- 9.
Westin 1967, p. 33.
- 10.
Westin 1967, p. 34.
- 11.
Westin 1967, p. 35.
- 12.
Westin 1967, p. 36.
- 13.
Westin 1967, p. 38.
- 14.
Westin 1967, pp. 31–32.
- 15.
Westin 1967, pp. 31–32.
- 16.
Westin 1967, pp. 31–32.
- 17.
Westin 1967, pp. 31–32.
- 18.
Westin 1967, pp. 31–32.
- 19.
Solove 2008.
- 20.
Allen 1988.
- 21.
Solove 2002.
- 22.
- 23.
Schwartz 2000.
- 24.
Schwartz 2000.
- 25.
See generally, Cohen 2000.
- 26.
Nissenbaum 2010, p. 71 (raising the question, “(h)as a person who intentionally posts photographs of himself to a Web site such as Flickr lost privacy?”).
- 27.
- 28.
Regan 1995.
- 29.
Schwartz 1999.
- 30.
Peppet 2012.
- 31.
Nissenbaum 2010, p. 71 (raising the question, “(h)as a person who intentionally posts photographs of himself to a Web site such as Flickr lost privacy?”).
- 32.
Nissenbaum 2010, p. 71 (raising the question, “(h)as a person who intentionally posts photographs of himself to a Web site such as Flickr lost privacy?”).
- 33.
Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, adopted Sept. 23, 1980 (explaining “(t)he purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfillment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.”).
- 34.
Cate 2006 (raising the following series of questions: “What is the difference between ‘collection limitation,’ ‘purpose specification,’ and ‘use limitation,’ all three of which appear in the OECD Guidelines, and how do they compare with ‘purpose limitation’ as that term is used to describe the EU directive? Does the latter include all three of the former?”).
- 35.
Article 6(1)(b) of Directive 95/46/EC; see also, Article 3 of Framework Decision 2008/977/JHA and Article 5(b) of the Council of Europe Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, ETS 108 (1981).
- 36.
See generally, Article 9 of the Council of Europe Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, ETS 108 (1981) (stating, “(d)erogation from (the purpose limitation principle) shall be allowed when such derogation is provided for by the law of the Party and constitutes a necessary measure in a democratic society in the interests of: protecting State security, public safety, the monetary interests of the State or the suppression of criminal offences (sic); protecting the data subject or the rights and freedoms of others.”); see also, Rotaru v Romania, ECtHR, judgment of 4 May 2000 (Para. 55) (adding another three conditions, notably that violations of privacy should also be precise, foreseeable and proportionate).
- 37.
Article 29 Data Protection Working Party: Opinion 03/2013 on purpose limitation (2 April 2013) available online at http://idpc.gov.mt/dbfile.aspx/Opinion3_2013.pdf.
- 38.
Article 29 Data Protection Working Party: Opinion 03/2013 on purpose limitation (2 April 2013) available online at http://idpc.gov.mt/dbfile.aspx/Opinion3_2013.pdf.
- 39.
Article 29 Data Protection Working Party: Opinion 03/2013 on purpose limitation (2 April 2013) available online at http://idpc.gov.mt/dbfile.aspx/Opinion3_2013.pdf.
- 40.
Article 29 Data Protection Working Party: Opinion 03/2013 on purpose limitation (2 April 2013) available online at http://idpc.gov.mt/dbfile.aspx/Opinion3_2013.pdf.
- 41.
See generally, Rotaru v Romania, ECtHR, judgment of 4 May 2000; see also, Leander v. Sweden, ECtHR, judgment of 26 March 1987.
- 42.
Westin 1967.
- 43.
ECtHR, Peck v. the United Kingdom (Para. 62); see also, Solove 2011.
- 44.
White 2012.
- 45.
See Cate 2006.
- 46.
Korff and Brown 2010.
- 47.
Korff and Brown 2010 (explaining that “…UK law refers to ‘policing purposes’ in one breath (and thus allows data obtained for one police purpose to be used for any such purpose), where German law strictly distinguishes between ‘countering immediate threats’, ‘general and specific prevention’, and ‘investigation and prosecution of [suspected] criminal offences.’”).
- 48.
Colonna 2012.
- 49.
Colonna 2012.
- 50.
EDPS Opinion on the data protection reform package (March 7, 2012).
- 51.
- 52.
Kuner et al. 2012.
- 53.
Kuner et al. 2012 (explaining that metadata is “data about when and where and how the underlying information was generated.”); see also, Biersdorfer 2006 (explaining that “(m)etadata, a term created by the fusion of an ancient Greek prefix with a Latin word, has come to mean ‘information about information’ when used in technology and database contexts. The Greek meta means behind, hidden or after, and refers to something in the background or not obviously visible, yet still present. Data, the Latin term, is factual information used for calculating, reasoning or measuring.”); see also, Government Surveillance 2012 (explaining that “(m)etadata (the records of who people call and e-mail, and when, as distinct from the content of conversations) can now be amassed on a vast scale, and run through powerful software that can use it to create a fairly complete portrait of a person’s life and habits–often far more complete than just a few recorded conversations.”).
- 54.
Han and Kamber 2001.
- 55.
Fayyad and Uthurusamy 2002.
- 56.
Han and Kamber 2001; see also, Symeonidis and Mitkas 2005 (stating, “(t)he human quest for knowledge and the inability to perceive the—continuously increasing—data volumes of a system has led to what we today call data mining.”); Schermer 2011, p. 45 (stating “(o)ver the past decades (data mining) has evolved from an experimental technology to an important instrument for private companies and institutions to help overcome the problem of information overload.”).
- 57.
Kuner et al. 2012.
- 58.
For an example of the lack of agreement on what “data mining” actually means, compare the definition of “data mining” in the 2007 US Data Mining Reporting Act (defining “data mining” in such a manner that requires the reporting of “pattern-based” tools but not “link-based” tools), with the definition of “data mining” provided by the US Government Accountability Office (GAO) (defining data mining, in its May 2004 report entitled “Data Mining: Federal Efforts Cover a Wide Range of Uses”, more broadly as “the application of database technology and techniques—such as statistical analysis and modeling—to uncover hidden patterns and subtle relationships in data and to infer rules that allow for the prediction of future results.”).
- 59.
Han and Kamber 2001.
- 60.
Han and Kamber 2001.
- 61.
See e.g., Sivanandam and Sumathi 2006.
- 62.
Sivanandam and Sumathi 2006.
- 63.
Jackson 2002.
- 64.
Sivanandam 2006.
- 65.
Symeonidis and Mitkas 2005.
- 66.
Zarsky 2011; see also, Schermer 2011 (explaining, “(t)he goal of descriptive data mining is to discover unknown relations between different data objects in a database. Descriptive data mining algorithms try to discover knowledge about a certain domain by determining commonalties between different objects and attributes. By discovering correlations between data objects in a dataset that is representative of a certain domain, we can gain insight to it.”).
- 67.
Symeonidis and Mitkas 2005.
- 68.
Zarsky 2011 (explaining that “(i)n a predictive process, the analysts use data mining applications to generate rules based on preexisting data. Thereafter, these rules are applied to newer (while partial) data, which is constantly gathered and examined as the software constantly searches for previously encountered patterns and rules. Based on new information and previously established patterns, the analysts strive to predict outcomes prior to their occurrence (while assuming that the patterns revealed in the past pertain to the current data as well).”); see also, Schermer 2011 (explaining “(a)s the name implies, the goal of predictive data mining is to make a prediction about events based on patterns that were determined using known information.”); Whitehorn 2006.
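The two-step predictive process Zarsky and Schermer describe—rules generated from preexisting data, then applied to newer, partial records—can be sketched in a few lines of Python. This is a hypothetical toy example of my own (echoing Whitehorn’s beer-and-diapers parable), not code drawn from the chapter or its sources:

```python
from collections import Counter, defaultdict

# Historical records: (attributes, known outcome).
history = [
    ({"bought_diapers": True,  "time": "evening"}, "buys_beer"),
    ({"bought_diapers": True,  "time": "evening"}, "buys_beer"),
    ({"bought_diapers": False, "time": "morning"}, "no_beer"),
    ({"bought_diapers": True,  "time": "morning"}, "no_beer"),
]

# Step 1 (rule generation): for each attribute pattern seen in the
# past, record which outcome occurred most often.
outcomes = defaultdict(Counter)
for attrs, outcome in history:
    outcomes[frozenset(attrs.items())][outcome] += 1
rules = {pattern: counts.most_common(1)[0][0]
         for pattern, counts in outcomes.items()}

# Step 2 (prediction): apply the learned rules to a newly gathered
# record, assuming past patterns still hold -- the assumption Zarsky
# flags as central to predictive mining.
new_record = {"bought_diapers": True, "time": "evening"}
prediction = rules.get(frozenset(new_record.items()), "unknown")
print(prediction)  # -> buys_beer
```

The point of the sketch is that the "rule" never existed before the historical data were scanned, which is precisely why a purpose for the processing is hard to specify in advance.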
- 69.
Custers 2013 (explaining that “… traditional statistical analysis usually begins with an hypothesis that is tested against the available data. Data mining tools usually generate hypotheses themselves and test these hypotheses against the available data.”).
- 70.
Calders and Custers 2013 (explaining, “Unlike in statistics, where the data is collected specially with the purpose of testing a particular hypothesis, or estimating the parameters of a model, in data mining one usually starts with historical data that was not necessarily collected with the purpose of analysis, but rather as a by-product of an operational system.”).
- 71.
Yoo et al. 2012.
- 72.
- 73.
Seifert 2006.
- 74.
Schermer 2011 (explaining that “(i)n unguided descriptive data mining we look for correlations in the data without using a pre-defined working hypothesis. Dependent on the size of the dataset and the ‘confidence interval’ used to determine correlations, our data mining exercise will yield certain results. While these results might indeed be significant, there is also a chance they are completely random. So, the results we find (and the hypothesis we formulate on the basis of these results) need to be validated to exclude the possibility that the correlation is in fact totally random.”).
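Schermer’s point about unguided descriptive mining—that a correlation search run without any working hypothesis will surface some purely random “findings” that must then be validated—can be illustrated with a short Python sketch. The dataset, attribute names, and threshold here are my own hypothetical choices, not the chapter’s:

```python
import random
from itertools import combinations

random.seed(1)

# A dataset of purely random binary attributes: any correlation we
# "discover" here is spurious by construction.
n_rows, n_attrs = 200, 12
data = [[random.randint(0, 1) for _ in range(n_attrs)]
        for _ in range(n_rows)]

def correlation(col_a, col_b):
    """Simple agreement score between two binary columns, in [-1, 1]."""
    agree = sum(a == b for a, b in zip(col_a, col_b))
    return 2 * agree / len(col_a) - 1

cols = list(zip(*data))

# Unguided search: test every attribute pair with no prior hypothesis.
found = [(i, j, correlation(cols[i], cols[j]))
         for i, j in combinations(range(n_attrs), 2)
         if abs(correlation(cols[i], cols[j])) > 0.15]

# With 66 pairs tested, a few will exceed the threshold by chance
# alone -- which is why Schermer stresses validating mined results
# (e.g. on fresh data) before treating them as knowledge.
print(len(found), "apparent correlations in pure noise")
```

Because the data are random noise, every pair reported in `found` is exactly the kind of “completely random” result the quoted passage warns about.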
- 75.
Calders and Custers 2013.
- 76.
Nissenbaum 2010.
- 77.
Imprecise Causality In Mined Rules 2003.
- 78.
Imprecise Causality In Mined Rules 2003.
- 79.
Han and Kamber 2001.
- 80.
Imprecise Causality In Mined Rules 2003.
- 81.
Berti-Equille 2007, p. 101.
- 82.
Taipale 2003.
- 83.
Taipale 2003.
- 84.
Taipale 2003 (explaining that “Data mining is the process of looking for new knowledge in existing data. The basic problem addressed by data mining is turning low-level data, usually too voluminous to understand, into higher forms (information or knowledge) that might be more compact (for example, a summary), more abstract (for example, a descriptive model), or more useful (for example, a predictive model). At the core of the data mining process is the application of data analysis and discovery algorithms to enumerate and extract patterns from data in a database.”).
- 85.
Han and Kamber 2001.
- 86.
Symeonidis and Mitkas 2005, (explaining that data mining “…confronts the visualization and understanding of large data sets efficiently.”).
- 87.
For one explanation of the difference between the two fields see Bertini and Lalanne 2009, p. 12 (explaining that "(w)hile information visualization (infovis) targets the visual representation of large-scale data collections to help people understand and analyze information, data mining, on the other hand, aims at extracting hidden patterns and models from data, automatically or semi-automatically.").
- 88.
Han and Kamber 2001.
- 89.
Zarsky 2011.
- 90.
Zarsky 2011.
- 91.
Zarsky 2011.
- 92.
Lloyd-Williams 1997.
- 93.
Lloyd-Williams 1997.
- 94.
Han and Kamber 2001.
- 95.
For more, see Keim 2002 (explaining that “(o)ne-dimensional data usually has one dense dimension. A typical example of one-dimensional data is temporal data… Two-dimensional data has two distinct dimensions. A typical example is geographical data where the two distinct dimensions are longitude and latitude… Many data sets consists of more than three attributes and therefore, they do not allow a simple visualization as 2-dimensional or 3-dimensional plots. Examples of multidimensional (or multivariate) data are tables from relational databases, which often have tens to hundreds of columns (or attributes). Since there is no simple mapping of the attributes to the two dimensions of the screen, more sophisticated visualization techniques are needed.”).
- 96.
Keim 2002.
- 97.
Nissenbaum 2010.
- 98.
Schwartz 1999.
- 99.
Nissenbaum 2010.
- 100.
- 101.
Gandy 2009.
- 102.
See, Rosenzweig 2010 (explaining that “the purpose and use limitations, if fully applied, would significantly degrade the analytical utility of many knowledge discovery systems.”).
- 103.
Cavoukian 1998 (explaining that a good data mining program cannot, in advance, delineate what the primary purpose will be because the “discovery model” upon which data mining is based, does not need an hypothesis, and without an hypothesis, establishing the specific purpose of data collection or data processing is a much more complex task); see also de Hert and Bellanova 2008.
- 104.
Article 29 Data Protection Working Party: Opinion 03/2013 on purpose limitation (2 April 2013) available online at http://idpc.gov.mt/dbfile.aspx/Opinion3_2013.pdf (explaining that “(s)pecification of purpose is an essential first step in applying data protection laws and designing data protection safeguards for any processing operation.”).
- 105.
Gunasekara 2009.
- 106.
Duhigg 2012.
- 107.
Duhigg 2012.
- 108.
Gunasekara 2009.
- 109.
See generally, Jonas 2009.
- 110.
Lyon 1994.
- 111.
- 112.
Amoore 2011.
- 113.
Amoore 2011.
- 114.
This example is based on a similar example provided by Louise Amoore during her presentation “Risk based security practices—Risk and the war on terror” held at the Amsterdam Privacy Conference (October 9, 2012).
- 115.
Calders and Custers 2013.
- 116.
See generally, Lyon 2008 (explaining, “In the case of Oyster cards in the UK, data that begin life in the commercial sphere of public transit, are increasingly required in police inquiries. Such data may also stay in the same context but as their uses grow, they may acquire some dangerous characteristics.”; internal citations omitted).
- 117.
Duhigg 2013.
- 118.
Ramasastry 2006, p. 757 (aptly using the phrase “lost in translation” to explain the problems that arise when data migrates from commercial data brokers to government entities in counter-terrorism data mining programs).
- 119.
Tavani 1999, p. 137.
- 120.
See Cate 2006.
- 121.
Cate 2006.
- 122.
Kirchberger 2011.
- 123.
For more on how the law works, see, Kirchberger 2011.
- 124.
See Seipel 2001 (where Seipel explains that “(i)n short, the misuse model would mean freedom to process whereas the processing model would mean that processing requires some kind of permission.”).
- 125.
Steele 2002.
- 126.
Steele 2002.
- 127.
Cate 2007.
- 128.
Schwartz 1999.
- 129.
Moor 1990 (“Although control of information is clearly an aspect of privacy, these definitions emphasizing control are inadequate for there are many situations in which people have no control over the exchange of personal information about themselves but in which there is no loss of privacy.”).
- 130.
- 131.
Felten 2012.
- 132.
- 133.
Opinion of the European Data Protection Supervisor on Promoting Trust in the Information Society by Fostering Data Protection and Privacy, Brussels (18 March 2010).
- 134.
Cavoukian 2012.
- 135.
See generally, Ponniah 2010.
- 136.
Hanumanthappa et al. 2012.
- 137.
Ozgul et al. 2012.
- 138.
Bienkowski et al. 2012.
- 139.
- 140.
For more, see Rubinstein 2013.
- 141.
Rubinstein 2013.
References
Allen, Anita L. 1988. Uneasy access: Privacy for women in a free society. Totowa: Rowman and Littlefield.
Amoore, Louise. 2011. Data derivatives: On the emergence of a security risk calculus for our times. Theory, Culture and Society (SAGE 2011) 28(6):24.
Arthur, Charles. 2011. What’s a zettabyte? By 2015, the internet will know, says Cisco. Technology Blog at The Guardian UK Newspaper.
Bennett, C. J. 2011. In defence of privacy: The concept and the regime. Surveillance and Society 8(4):485.
Berti-Equille, Laure. 2007. Measuring and modelling data quality for quality-awareness. Data Mining, Quality Measures in Data Mining 43:101–126.
Bertini, Enrico, and Denis Lalanne. 2009. Surveying the complementary role of automatic data analysis and visualization in knowledge discovery. Proceedings of the ACM SIGKDD workshop on visual analytics and knowledge discovery: Integrating automated analysis with interactive exploration (VAKD’09) 12–20. New York: ACM.
Bienkowski, Marie, Mingyu Feng, and Barbara Means. 2012. Enhancing teaching and learning through educational data mining and learning analytics: An issue brief. Washington, D. C.: US Department of Education.
Biersdorfer, J. D. 2006. Weeding out Windows fonts. The New York Times, February 16.
Calders, Toon, and Bart Custers. 2013. What is data mining and how does it work. In Discrimination and privacy in the information society: Data mining and profiling in large databases, eds. Bart Custers, Tal Zarsky, Bart Schermer and Toon Calders, p. 28. Berlin: Springer.
Cate, Fred H. 2006. The failure of fair information practice principles. In Consumer protection in the age of the information economy, ed. Jane K. Winn. Surry: Ashgate.
Cate, Fred H. 2007. The autonomy trap. The Privacy Symposium Cambridge, MA. http://www.fredhcate.com/Publications/The%20Autonomy%20Trap.revised.pdf. Accessed 24 Aug 2007.
Cavoukian, Ann. 1998. Data mining: Staking a claim on your privacy, Information and Privacy Commissioner/Ontario. http://www.ipc.on.ca/images/resources/datamine.pdf. Accessed 15 Sept 2013.
Cavoukian, Ann. 2012. Operationalizing privacy by design: A guide to implementing strong information and privacy practices. http://www.ipc.on.ca/images/Resources/operationalizing-pbd-guide.pdf. Accessed 4 Dec 2012.
Cavoukian, Ann, and Jeff Jonas. 2012. Privacy by design in the age of big data. http://privacybydesign.ca/content/uploads/2012/06/pbd-big_data.pdf. Accessed 8 June 2012.
Cohen, Julie E. 2000. Examined lives: Informational privacy and the subject as an object. Stanford Law Review 52:1373.
Colonna, Liane. 2012. The new EU proposal to regulate data protection in the law enforcement sector: Raises the bar but not high enough. IRI-memo, Nr. 2/2012.
Council of Europe Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (ETS No. 108) (Jan. 28, 1981).
Council Framework Decision. 2008. 2008/977/JHA of 27 November 2008 on the protection of personal data processed in the framework of police and judicial cooperation in criminal matters. Official Journal L 350:0060–0071.
Custers, Bart. 2013. Data dilemmas in the information society: Introduction and overview. In Discrimination and privacy in the information society: Data mining and profiling in large databases, eds. Bart Custers, Tal Zarsky, Bart Schermer and Toon Calders. Berlin: Springer.
Rowan, David. 2011. Personal data mining to improve your cognitive toolkit. David Rowan’s Blog. http://www.wired.co.uk/news/archive/2011-01/18/edge-question. Accessed 18 January 2011.
De Hert, Paul, and Rocco Bellanova. 2008. Data protection from a transatlantic perspective: The EU and US move towards an International Data protection agreement? Brussels: European Parliament’s Committee on Civil Liberties, Justice and Home Affairs.
Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data Official Journal L 281, 23/11/1995 P. 0031–0050.
Duhigg, Charles. 2012. How companies learn your secrets. The New York Times, February 16.
Duhigg, Charles. 2013. Mine personal lives to get out vote. The New York Times, October 13.
Eisenberg, Anne. 2012. What 23 years of E-Mail may say about you, The New York Times (April 7, 2012)
Fayyad, U., and R. Uthurusamy. 2002. Evolving data into mining solutions for insights. Communications of the ACM 45(8):28–31.
Felten, Ed. 2012. Accountable algorithms. Freedom to Tinker: Research and expert commentary on digital technologies in public life. https://freedom-to-tinker.com/blog/felten/accountable-algorithms/. Accessed 12 September 2012.
Fried, Charles. 1968. Privacy. Yale Law Journal 77:475.
Gandy, O.H. 2009. Coming to terms with chance: Engaging rational discrimination and cumulative disadvantage. Aldershot: Ashgate.
Government Surveillance. 2012. Little peepers everywhere. The Economist Issue 950.
Gunasekara, Gehan. 2009. The ‘final’ privacy frontier? Regulating trans-border data flows. International Journal of Law and Information Technology 17:147.
Han, Jiawei, and Micheline Kamber. 2001. Data mining: Concepts and techniques. San Diego: Academic Press.
Hanumanthappa, M., B. R. Prakash, and Manish Kumar. 2012. Applications of data mining in e-governance: A case study of Bhoomi project in data engineering and management lecture notes in computer science. vol. 6411, 208. Springer.
Imprecise Causality In Mined Rules. 2003. Proceedings: Rough sets, fuzzy sets, data mining, and granular computing: 9th International Conference, RSFDGrC 2003, Chongqing, China (Lecture Notes in Computer Science, Springer-Verlag Heidelberg, v 2639/ 2003), 581.
Inness, Julie C. 1996. Privacy, intimacy, and isolation. Oxford: Oxford University Press.
Jackson, Joyce. 2002. Data mining: A conceptual overview. Communications of the Association for Information Systems 8:267.
Jonas, Jeff. 2009. Data finds data. http://jeffjonas.typepad.com/jeff_jonas/2009/07/data-findsdata.html. Accessed 15 Sept 2013.
Keim, Daniel A. 2002. Information visualization and visual data mining. IEEE Transactions on Visualization and Computer Graphics 100.
Kirchberger, Christine. 2011. Cyber law in Sweden. The Netherlands: Kluwer Law International.
Korff, Douwe, and Ian Brown. 2010. Comparative study on different approaches to new privacy challenges, in particular in the light of technological developments. European Commission.
Kuner, Christopher, Fred H. Cate, Christopher Millard, and Dan Jerker B. Svantesson. 2012. The challenge of ‘big data’ for data protection. International Data Privacy Law 2(2):47.
Lloyd-Williams, Michael. 1997. Discovering the hidden secrets in your data—the data mining approach to information. Information Research: An International Electronic Journal.
Lyon, D. 1994. The electronic eye: The rise of surveillance society. Minneapolis: University of Minnesota Press.
Lyon, David. 2008. Surveillance society. Talk for Festival del Diritto, Piacenza, Italia.
Miller, Arthur. 1971. The assault on privacy. Ann Arbor: University of Michigan Press.
Moor, James H. 1990. The ethics of privacy protection. Library Trends 39(Fall):69.
Mulligan, Deirdre K., and Jennifer King. 2012. Bridging the gap between privacy and design. University of Pennsylvania Journal of Constitutional Law 14(4):989.
Nissenbaum, Helen. 2010. Privacy in context: technology, policy, and the integrity of social life. Stanford: Stanford University (Stanford Law Books).
Ozgul, F., M. Gok, A. Celik, and Y. Ozal. 2012. Mining hate crimes to figure out reasons behind. Advances in Social Networks Analysis and Mining (ASONAM), 2012 IEEE/ACM International Conference 887.
Peppet, Scott R. 2012. Privacy and the personal prospectus: Should we introduce privacy agents or regulate privacy intermediaries? Iowa Law Review Bulletin 7:77.
Ponniah, P. 2010. Data mining basics in data warehousing fundamentals for it professionals, 2nd ed. Hoboken: Wiley.
Ramasastry, Anita. 2006. Lost in translation? Data mining, national security and the ‘adverse inference’ problem. Santa Clara Computer and High Tech Law Journal 22(4):757.
Rosenzweig, Paul. 2010. Privacy and counter-terrorism: The pervasiveness of data, case western reserve. Journal of International Law 42(3):625.
Regan, Priscilla M. 1995. Legislating privacy: technology, social values, and public policy, p. 9. North Carolina: University of North Carolina Press.
Rubinstein, Ira S. 2013. Big data: The end of privacy or a new beginning? International Data Privacy Law 12–56.
Schermer, Bart W. 2011. The limits of privacy in automated profiling and data mining. Computers Law and Security Review 27:45.
Schoeman, Ferdinand. 1992. Privacy and social freedom. Cambridge: Cambridge University Press.
Schwartz, Paul M. 2000. Internet privacy and the state. Connecticut Law Review 32:815.
Schwartz, Paul M. 1999. Privacy and democracy in cyberspace. Vanderbilt Law Review 52:1609.
Seifert, Jeffrey W. 2006. Data mining and homeland security: An overview, US Congressional Research Service Report.
Seipel, Peter. 2001. Privacy and freedom of information in Sweden in nordic data protection law. 1st ed., ed. P. Blume, 124. Copenhagen: DJØF Publishing.
Sivanandam, S. N. and S. Sumathi. 2006. Introduction to data mining and its applications. Springer.
Solove, Daniel J. 2002. Conceptualizing privacy. California Law Review 90:1087.
Solove, Daniel J. 2008. Understanding privacy. Boston: Harvard University Press.
Solove, Daniel J. 2011. Nothing to hide: The false tradeoff between privacy and security. New Haven: Yale University Press.
Steele, Jonathan. 2002. Data protection: An opening door? Liverpool Law Review 24(1–2):19.
Symeonidis, Andreas L., and Pericles A. Mitkas. 2005. Data mining and knowledge discovery: A brief overview. In Agent intelligence through data mining multiagent systems, artificial societies, and simulated organizations. Springer.
Taipale, K. A. 2003. Data mining and domestic security: Connecting the dots to make sense of data. Columbia Science and Technology Law Review 5:1.
Tavani, Herman T. 1999. Informational privacy, data mining, and the internet. Ethics and information technology. vol. 1, Issue 2, 137. Kluwer Academic Publishers.
Van den Hoven, J. 2007. Information technology, privacy and the protection of personal data. In Information technology and moral philosophy. Cambridge: Cambridge University Press.
Vijayan, Jaikumar. 2007. DHS must assess privacy risk before using data mining tool, GAO says. Computerworld.
Westin, Alan. 1967. Privacy and freedom. New York: Atheneum.
White, Martha C. 2012. Could that Facebook ‘like’ hurt your credit score? Time Magazine.
Whitehorn, Mark. 2006. The parable of the beer and diapers: Never let the facts get in the way of a good story, The Register.
Wiley, Steven. 2008. Hypothesis-Free? No such thing: Even so-called ‘discovery-driven research’ needs a hypothesis to make any sense, The Scientist Magazine.
Wright, David and Paul de Hert, eds. 2012. Privacy impact assessment. Series: Law, governance and technology series. vol. 6. Springer.
Yoo, Illhoi, Patricia Alafaireet, Miroslav Marinov, Keila Pena-Hernandez, Rajitha Gopidi, Jia-Fu Chang, and Lei Hua. 2012. Data mining in healthcare and biomedicine: A survey of the literature. Journal of Medical Systems 36(4):2431–2448.
Zarsky, Tal Z. 2011. Governmental data mining and its alternatives. Penn State Law Review 116(2):285.
Copyright information
© 2014 Springer Science+Business Media Dordrecht
Cite this chapter
Colonna, L. (2014). Data Mining and Its Paradoxical Relationship to the Purpose Limitation Principle. In: Gutwirth, S., Leenes, R., De Hert, P. (eds) Reloading Data Protection. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-7540-4_14
DOI: https://doi.org/10.1007/978-94-007-7540-4_14
Publisher Name: Springer, Dordrecht
Print ISBN: 978-94-007-7539-8
Online ISBN: 978-94-007-7540-4