Nonnatural Personal Information. Accounting for Misleading and Non-misleading Personal Information


Abstract

Personal information is key to informational privacy and to the algorithmically generated profiles of individuals. Yet the concept of personal information, and its nature, is rarely discussed. The concept seems to rest on an idea of information as objective and truthful—as natural information—depicted as digital footprints in the online and digital realm. I argue that the concept of personal information should exit the realm of natural information and enter the realm of nonnatural information—grounded in meaning, intention, and convention—as this provides a concept that can account for potential misleadingness, inaccuracies, and mistakes.


Fig. 1


Notes

  1. In the present article, “natural information” may refer to either of two varieties, depending on the context. Most often, it refers to Dretske-information, as this is the conception of information most often at play in the privacy literature when personal information is mentioned.

  2.

    The concepts of nonnatural personal information, unintentionally non-misleading personal information, intentionally non-misleading personal information, personal misinformation, and personal disinformation have not previously been developed and are thus not present within the privacy literature.

  3.

    It should be noted that I am only committing myself to the phrasing of the legal definition in terms of the relation between some information and an identifiable natural person. I am not tying the definition of nonnatural personal information to any of the specifications of the elements of the original legal definition as laid out in the opinion published by the Data Protection Working Party (2007).

  4.

    In the “all-or-nothing” interpretation equivalent to Dretske’s (1981) concept of information.

  5.

    The Data Protection Working Party (2007) does not distinguish between data and information. As seen in the quotation, personal data is defined in terms of information, and the document does not offer a general definition of information.

  6.

    It should be noted that a footprint in principle can be generated with the intention to mislead. For instance, it is possible to walk backwards in the snow in order to mislead about the direction one has taken. However, this possibility is not reflected in the idea of the digital footprint, which is treated as something objective, truthful, and accurate.

  7.

    The idea that information is something one can control and/or restrict access to seems to be bound to an idea of information as property—i.e., something which can be owned and thereby controlled by some specific individual, company, or government. For a discussion of ownership of information (data), see Hummel, Braun, and Dabrock (2020).

  8.

    In fact, we might need to question the profiling business as such.


References

  1. Agre, P. E. (1994). Surveillance and capture: Two models of privacy. The Information Society, 10(2), 101–127.

  2. Allo, P. (2020). The epistemology of non-distributive profiles. Philosophy & Technology, 33(3), 379–409.

  3. Andersen, J. (2018). Archiving, ordering, and searching: Search engines, algorithms, databases, and deep mediatization. Media, Culture & Society.

  4. Blaauw, M. (2013). The epistemic account of privacy. Episteme, 10(2), 167–177.

  5. Bucher, T. (2016). The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20(1), 30–44.

  6. Cavoukian, A. (2011). Privacy by design: The 7 foundational principles. Accessed 11 February 2020.

  7. Chahal, M. (2015). Consumers are ‘dirtying’ databases with false details. Marketing Week, 8 July 2015.

  8. Cohen, J. E. (2012). Configuring the networked self: Law, code, and the play of everyday practices. Yale University Press.

  9. Cohen, J. E. (2013). What privacy is for. Harvard Law Review, 126(7), 1904–1933.

  10. Data Protection Working Party. (2007). Opinion 4/2007 on the concept of personal data. Article 29, WP 136. Brussels: European Commission.

  11. DeCew, J. (2002). Privacy. In E. N. Zalta (Ed.), Stanford Encyclopedia of Philosophy (Spring 2015 Edition). First published May 14, 2002; substantive revision Aug 9, 2013.

  12. Dretske, F. (1981). Knowledge and the flow of information. MIT Press.

  13. Dretske, F. (2008). Epistemology and information. In P. Adriaans & J. van Benthem (Eds.), Handbook of the philosophy of science, Vol. 8: Philosophy of information (pp. 29–47). Elsevier.

  14. Fallis, D. (2011). Floridi on disinformation. Etica & Politica, 13(2), 201–214.

  15. Fallis, D. (2014). The varieties of disinformation. In L. Floridi & P. Illari (Eds.), The philosophy of information quality (pp. 135–161). Springer.

  16. Fallis, D. (2015). What is disinformation? Library Trends, 63(3), 401–426.

  17. Fetzer, J. H. (2004). Information: Does it have to be true? Minds and Machines, 14, 223–229.

  18. Floridi, L. (2005). The ontological interpretation of informational privacy. Ethics and Information Technology, 7, 185–200.

  19. Floridi, L. (2011). The philosophy of information. Oxford Scholarship Online.

  20. Floridi, L. (2013). The ontological interpretation of informational privacy. In The ethics of information (pp. 228–260). Oxford University Press.

  21. Fox, C. J. (1983). Information and misinformation: An investigation of the notions of information, misinformation, informing, and misinforming. London: Greenwood Press.

  22. Galič, M., Timan, T., & Koops, B.-J. (2017). Bentham, Deleuze and beyond: An overview of surveillance theories from the panopticon to participation. Philosophy & Technology, 30, 9–37.

  23. Grice, H. P. (1957). Meaning. In P. Grice (1989), Studies in the way of words (pp. 213–223). Cambridge, MA: Harvard University Press.

  24. Grice, H. P. (1967). Logic and conversation. In P. Grice (1989), Studies in the way of words (pp. 22–40). Cambridge, MA: Harvard University Press.

  25. Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. British Journal of Sociology, 51(4), 605–622.

  26. Harcourt, B. E. (2015). Exposed. Harvard University Press.

  27. Hjørland, B. (2007). Information: Objective or subjective/situational? Journal of the American Society for Information Science and Technology, 58(10), 1448–1456.

  28. Hummel, P., Braun, M., & Dabrock, P. (2020). Own data? Ethical reflections on data ownership. Philosophy & Technology. Online first.

  29. Le Morvan, P. (2018). Information, privacy, and false light. In A. E. Cudd & M. C. Navin (Eds.), Core concepts and contemporary issues in privacy (pp. 79–90). Springer.

  30. Mahon, J. (2008). The definition of lying and deception. In E. N. Zalta (Ed.), Stanford Encyclopedia of Philosophy (Fall 2009 Edition). First published Feb. 21, 2008.

  31. Nissenbaum, H. (2010). Privacy in context: Technology, policy and the integrity of social life. Stanford Law Books.

  32. Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Cambridge, MA: Harvard University Press.

  33. Piccinini, G. (2020). Nonnatural mental representation. In J. Smortchkova, K. Dołȩga, & T. Schlicht (Eds.), What are mental representations? (pp. 254–286). New York, NY: Oxford University Press.

  34. Rotenberg, M., et al. (2015). Privacy in the modern age: The search for solutions. The New Press.

  35. Russo, F. (2018). Digital technologies, ethical questions, and the need of an informational framework. Philosophy & Technology, 31(4), 655–667.

  36. Scarantino, A., & Piccinini, G. (2010). Information without truth. Metaphilosophy, 41(3), 313–330.

  37. Solove, D. J. (2008). Understanding privacy. Harvard University Press.

  38. Søe, S. O. (2016). The urge to detect, the need to clarify: Gricean perspectives on information, misinformation, and disinformation. PhD thesis, Faculty of Humanities, University of Copenhagen.

  39. Søe, S. O. (2018). Algorithmic detection of misinformation and disinformation: Gricean perspectives. Journal of Documentation, 74(2), 309–332.

  40. Søe, S. O. (2019a). A Floridian dilemma: Semantic information and truth. Information Research, 24(2), 827.

  41. Søe, S. O. (2019b). A unified account of information, misinformation, and disinformation. Synthese.

  42. Striphas, T. (2015). Algorithmic culture. European Journal of Cultural Studies, 18(4–5), 395–412.

  43. Tavani, H. T. (2008). Informational privacy: Concepts, theories, and controversies. In K. E. Himma & H. T. Tavani (Eds.), The handbook of information and computer ethics (pp. 131–164). Wiley.

  44. Tavani, H. T. (2012). Search engines and ethics. In E. N. Zalta (Ed.), Stanford Encyclopedia of Philosophy (Fall 2016 Edition). First published Aug. 27, 2012; substantive revision July 8, 2016.

  45. Wolf, C., et al. (2015). Envisioning privacy in the world of big data. In M. Rotenberg (Ed.), Privacy in the modern age: The search for solutions (pp. 204–216). The New Press.

  46. Youyou, W., Kosinski, M., & Stillwell, D. (2015). Computer-based personality judgments are more accurate than those made by humans. PNAS, 112(4), 1036–1040.



Acknowledgements

I wish to thank my colleagues in the project “Don't Take it Personal”, Jens-Erik Mai, Rikke Frank Jørgensen, Bjarki Valtysson, Taina Bucher, Johan Lau Munkholm, and Jesper Pagh, for fruitful discussions along the way. Furthermore, I would like to thank Mike Katell and Irina Shklovski for very thorough and useful comments on an earlier draft, as well as all the attendees at the Information Ethics Roundtable in Copenhagen in 2018 (IER2018).


Funding

This research was conducted within the project “Don’t Take it Personal”: Privacy and Information in an Algorithmic Age, which is generously funded by the Independent Research Fund Denmark, grant number 8018-00041B.

Author information



Corresponding author

Correspondence to Sille Obelitz Søe.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Søe, S.O. Nonnatural Personal Information. Accounting for Misleading and Non-misleading Personal Information. Philos. Technol. (2021).



Keywords

  • Personal information
  • Nonnatural information
  • Misleadingness
  • Personal misinformation
  • Personal disinformation
  • Algorithmic profiles
  • Privacy
  • Information