Personal information is key to informational privacy and to the algorithmically generated profiles of individuals. Yet the nature of personal information itself is rarely discussed. The prevailing concept of personal information seems to rest on an idea of information as objective and truthful (natural information), depicted as digital footprints in the online and digital realm. I argue that the concept of personal information should exit the realm of natural information and enter the realm of nonnatural information, grounded in meaning, intention, and convention, as this provides a concept that can account for potential misleadingness, inaccuracies, and mistakes.
In the present article, references to natural information may mean either of the two varieties, depending on the context. Most often, the reference is to Dretske-information, as this is the conception of information most often at play in the privacy literature when personal information is mentioned.
The concepts of nonnatural personal information, unintentionally non-misleading personal information, intentionally non-misleading personal information, personal misinformation, and personal disinformation have not previously been developed and are thus not present within the privacy literature.
It should be noted that I am only committing myself to the phrasing of the legal definition in terms of the relation between some information and an identifiable natural person. I am not tying the definition of nonnatural personal information to any of the specifications of the elements of the original legal definition as laid out in the opinion published by the Data Protection Working Party (2007).
In the “all-or-nothing” interpretation equivalent to Dretske’s (1981) concept of information.
The Data Protection Working Party (2007) does not distinguish between data and information. As seen in the quotation, personal data is defined in terms of information, and the document does not offer a general definition of information.
It should be noted that a footprint in principle can be generated with the intention to mislead. For instance, it is possible to walk backwards in the snow in order to mislead about the direction one has taken. However, this possibility is not reflected in the idea of the digital footprint, which is treated as something objective, truthful, and accurate.
The idea that information is something one can control and/or restrict access to seems to be bound to an idea of information as property—i.e., something which can be owned and thereby controlled by some specific individual, company, or government. For a discussion of ownership of information (data), see Hummel, Braun, and Dabrock (2020).
In fact, we might need to question the profiling business as such.
Agre, P. E. (1994). Surveillance and capture: Two models of privacy. The Information Society, 10(2), 101–127.
Allo, P. (2020). The epistemology of non-distributive profiles. Philosophy & Technology, 33(3), 379–409.
Andersen, J. (2018). Archiving, ordering, and searching: Search engines, algorithms, databases, and deep mediatization. Media, Culture & Society. https://doi.org/10.1177/0163443718754652
Blaauw, M. (2013). The epistemic account of privacy. Episteme, 10(2), 167–177.
Bucher, T. (2016). The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20(1), 30–44.
Cavoukian, A. (2011). Privacy by design – The 7 foundational principles. https://www.ipc.on.ca/wp-content/uploads/Resources/7foundationalprinciples.pdf. Accessed 11 February 2020.
Chahal, M. (2015). Consumers are ‘dirtying’ databases with false details. Marketing Week, 8 July 2015.
Cohen, J. E. (2012). Configuring the networked self: Law, code, and the play of everyday practices. Yale University Press.
Cohen, J. E. (2013). What privacy is for. Harvard Law Review, 126(7), 1904–1933.
Data Protection Working Party. (2007). Opinion 4/2007 on the concept of personal data. Article 29, WP 136. Brussels: European Commission.
DeCew, J. (2002). Privacy. In E. N. Zalta (Ed.), Stanford encyclopedia of philosophy (Spring 2015 ed.). First published May 14, 2002; substantive revision Aug. 9, 2013. https://plato.stanford.edu/archives/spr2015/entries/privacy/
Dretske, F. (1981). Knowledge and the flow of information. MIT Press.
Dretske, F. (2008). Epistemology and information. In P. Adriaans & J. van Benthem (Eds.), Philosophy of information (Handbook of the philosophy of science, Vol. 8, pp. 29–47). Elsevier.
Fallis, D. (2011). Floridi on disinformation. Etica & Politica, 13(2), 201–214.
Fallis, D. (2014). The varieties of disinformation. In L. Floridi & P. Illari (Eds.), The philosophy of information quality (pp. 135–161). Springer.
Fallis, D. (2015). What is disinformation? Library Trends, 63(3), 401–426.
Fetzer, J. H. (2004). Information: Does it have to be true? Minds and Machines, 14, 223–229.
Floridi, L. (2005). The ontological interpretation of informational privacy. Ethics and Information Technology, 7, 185–200.
Floridi, L. (2011). The philosophy of information. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199232383.001.0001
Floridi, L. (2013). The ontological interpretation of informational privacy. In The ethics of information (pp. 228–260). Oxford University Press.
Fox, C. J. (1983). Information and misinformation: An investigation of the notions of information, misinformation, informing, and misinforming. Greenwood Press.
Galič, M., Timan, T., & Koops, B.-J. (2017). Bentham, Deleuze and beyond: An overview of surveillance theories from the panopticon to participation. Philosophy and Technology, 30, 9–37.
Grice, H. P. (1957). Meaning. Reprinted in P. Grice (1989), Studies in the way of words (pp. 213–223). Harvard University Press.
Grice, H. P. (1967). Logic and conversation. Reprinted in P. Grice (1989), Studies in the way of words (pp. 22–40). Harvard University Press.
Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. British Journal of Sociology, 51(4), 605–622.
Harcourt, B. E. (2015). Exposed. Harvard University Press.
Hjørland, B. (2007). Information: Objective or subjective/situational? Journal of the American Society for Information Science and Technology, 58(10), 1448–1456.
Hummel, P., Braun, M., & Dabrock, P. (2020). Own data? Ethical reflections on data ownership. Philosophy & Technology. Online first.
Le Morvan, P. (2018). Information, privacy, and false light. In A. E. Cudd & M. C. Navin (Eds.), Core concepts and contemporary issues in privacy (pp. 79–90). Springer.
Mahon, J. (2008). The definition of lying and deception. In E. N. Zalta (Ed.), Stanford encyclopedia of philosophy (Fall 2009 ed.). First published Feb. 21, 2008. http://plato.stanford.edu/archives/fall2009/entries/lying-definition/
Nissenbaum, H. (2010). Privacy in context: Technology, policy and the integrity of social life. Stanford Law Books.
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.
Piccinini, G. (2020). Nonnatural mental representation. In J. Smortchkova, K. Dołȩga, & T. Schlicht (Eds.), What are mental representations? (pp. 254–286). Oxford University Press.
Rotenberg, M., et al. (2015). Privacy in the modern age: The search for solutions. The New Press.
Russo, F. (2018). Digital technologies, ethical questions, and the need of an informational framework. Philosophy & Technology, 31(4), 655–667.
Scarantino, A., & Piccinini, G. (2010). Information without truth. Metaphilosophy, 41(3), 313–330.
Solove, D. J. (2008). Understanding privacy. Harvard University Press.
Søe, S. O. (2016). The urge to detect, the need to clarify: Gricean perspectives on information, misinformation, and disinformation. PhD thesis, Faculty of Humanities, University of Copenhagen.
Søe, S. O. (2018). Algorithmic detection of misinformation and disinformation: Gricean perspectives. Journal of Documentation, 74(2), 309–332.
Søe, S. O. (2019a). A Floridian dilemma: Semantic information and truth. Information Research, 24(2), 827.
Søe, S. O. (2019b). A unified account of information, misinformation, and disinformation. Synthese. https://doi.org/10.1007/s11229-019-02444-x.
Striphas, T. (2015). Algorithmic culture. European Journal of Cultural Studies, 18(4–5), 395–412.
Tavani, H. T. (2008). Informational privacy: Concepts, theories, and controversies. In K. E. Himma & H. T. Tavani (Eds.), The handbook of information and computer ethics (pp. 131–164). Wiley.
Tavani, H. T. (2012). Search engines and ethics. In E. N. Zalta (Ed.), Stanford encyclopedia of philosophy (Fall 2016 ed.). First published Aug. 27, 2012; substantive revision July 8, 2016. https://plato.stanford.edu/archives/fall2016/entries/ethics-search/
Wolf, C., et al. (2015). Envisioning privacy in the world of big data. In M. Rotenberg (Ed.), Privacy in the modern age: The search for solutions (pp. 204–216). The New Press.
Youyou, W., Kosinski, M., & Stillwell, D. (2015). Computer-based personality judgments are more accurate than those made by humans. PNAS, 112(4), 1036–1040.
I wish to thank my colleagues in the project “Don’t Take it Personal”: Jens-Erik Mai, Rikke Frank Jørgensen, Bjarki Valtysson, Taina Bucher, Johan Lau Munkholm, and Jesper Pagh, for fruitful discussions along the way. Furthermore, I would like to thank Mike Katell and Irina Shklovski for very thorough and useful comments on an earlier draft, as well as all the attendees at the Information Ethics Roundtable in Copenhagen in 2018 (IER2018).
This research was conducted within the project “Don’t Take it Personal”: Privacy and Information in an Algorithmic Age, which is generously funded by the Independent Research Fund Denmark, grant number: 8018-00041B.
Søe, S. O. (2021). Nonnatural personal information: Accounting for misleading and non-misleading personal information. Philosophy & Technology. https://doi.org/10.1007/s13347-021-00457-4
- Personal information
- Nonnatural information
- Personal misinformation
- Personal disinformation
- Algorithmic profiles