Abstract
This paper argues in favor of a hybrid conception of identity. A common conception of identity in datafied society is a split between a digital self and a real self, which has resulted in concepts such as the data double, algorithmic identity, and data shadows. These data-identity metaphors have played a significant role in the conception of informational privacy as control over information—the control of or restricted access to your digital identity. Through analyses of various data-identity metaphors as well as philosophical accounts of identity, we argue in favor of a hybrid conception of identity that emphasizes the relations between the ‘real’ and the ‘digital’. A hybrid conception of identity—where the digital is an aspect on par with social relations, self-understanding, and values—ultimately calls for an understanding of privacy as the right to influence one’s own identity.
Data availability
Not applicable.
Code availability
Not applicable.
Notes
In the following, we present a selection of identity theories in order to demonstrate how identity is a complex process involving both internal and external factors. The review is not exhaustive and other theories could have been chosen – e.g. Hacking’s (1996) looping effects of human kinds or Wallace’s (2019) network self – however, we have chosen a set of theories that in some way or other explicitly deal with the digital domain or the structuring force of language.
Solove is a pragmatist arguing in favor of a pluralist understanding of privacy based on the protection against different yet related problems. Thus, Solove (2008) is not a control theorist and does not defend any single theory of privacy. However, he provides a thorough overview, including detailed descriptions of the different conceptions of privacy.
Hildebrandt (2019) also argues in favor of the connection between privacy and identity in advancing the account that privacy needs to be understood “as the protection of the incomputable self” (p. 96).
The original text is written partly backwards. Each word is written in the right direction but the order is reversed and reads from the bottom up. The quote in the original reads as follows: “time in unfixed be to is human be To” (Wittkower, 2011, p. 297).
References
Agre, P. E. (2001). Introduction. In P. E. Agre, & M. Rotenberg (Eds.), Technology and privacy: The new landscape (pp. 1–28). MIT Press.
Allo, P. (2020). The epistemology of non-distributive profiles. Philosophy & Technology, 33(3), 379–409.
Amoore, L. (2020). Cloud ethics: Algorithms and the attributes of ourselves and others. Duke University Press.
Article 29 Data Protection Working Party (2007). Opinion on the concept of personal data. Working Paper 136. EU Justice.
Becker, M. (2019). Privacy in the digital age: Comparing and contrasting individual versus social approaches towards privacy. Ethics and Information Technology, 21, 307–317.
Bowker, G. C., Baker, K., Millerand, F., & Ribes, D. (2010). Towards information infrastructure studies: Ways of knowing in a networked environment. In J. Hunsinger, J. M. Allen, & L. Klastrup (Eds.), International Handbook of Internet Research (pp. 97–117). Springer.
Bowker, G. C., & Star, S. L. (2000). Sorting things out: Classification and its consequences. MIT Press.
Bucher, T. (2017). The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20(1), 30–44.
Cadwalladr, C., & Graham-Harrison, E. (2018, March 17). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
Cheney-Lippold, J. (2011). A new algorithmic identity: Soft biopolitics and the modulation of control. Theory, Culture & Society, 28(6), 164–181.
Cheney-Lippold, J. (2017). We are data: Algorithms and the making of our digital selves. NYU Press.
Coeckelbergh, M. (2017). Using words and things. Language and philosophy of technology. Routledge.
DeCew, J. (2018). Privacy. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Spring 2018 ed.). (First published May 14, 2002; substantive revision January 18, 2018). https://plato.stanford.edu/archives/spr2018/entries/privacy/
de Vries, K. (2010). Identity, profiling algorithms and a world of ambient intelligence. Ethics and Information Technology, 12(1), 71–85.
Duhigg, C. (2012, February 16). How companies learn your secrets. New York Times. http://www.nytimes.com/2012/02/19/magazine/shopping-habits.html
Floridi, L. (2015). The onlife manifesto: Being human in a hyperconnected era. Springer Nature.
Floridi, L. (2005). The ontological interpretation of informational privacy. Ethics and Information Technology, 7(4), 185–200.
Gill, L. (2018). Law, metaphor, and the encrypted machine. Osgoode Hall Law Journal, 55(2), 440–477.
The Guardian (n.d.). The Cambridge Analytica files. The Guardian. https://www.theguardian.com/news/series/cambridge-analytica-files
Greene, T., & Shmueli, G. (2019). How personal is machine learning personalization? arXiv preprint, arXiv:1912.07938.
Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. British Journal of Sociology, 51(4), 605–622.
Harcourt, B. E. (2015). Exposed: Desire and disobedience in the digital age. Harvard University Press.
Henschke, A. (2017). Ethics in an age of surveillance: Personal information and virtual identities. Cambridge University Press.
Hildebrandt, M. (2019). Privacy as protection of the incomputable self: From agnostic to agonistic machine learning. Theoretical Inquiries in Law, 20(1), 83–121.
Jenkins, R. (2012). Identity, surveillance and modernity: Sorting out who’s who. In K. Ball, K. Haggerty, & D. Lyon (Eds.), Routledge Handbook of Surveillance Studies (pp. 159–166). Routledge.
Jensen, K. B., & Helles, R. (2017). Speaking into the system: Social media and many-to-one communication. European Journal of Communication, 32(1), 16–25.
Koopman, C. (2019a). Information before information theory: The politics of data beyond the perspective of communication. New Media and Society, 21(6), 1326–1343.
Koopman, C. (2019b). How we became our data: A genealogy of the informational person. University of Chicago Press.
Koops, B. J. (2021). The concept of function creep. Law, Innovation and Technology, 13(1), 29–56.
Lynch, M. P. (2019). Know-it-all society: Truth and arrogance in political culture. Liveright.
Lyon, D. (2001). Surveillance society: Monitoring everyday life. Open University Press.
Macnish, K. (2020). Mass surveillance: A private affair? Moral Philosophy and Politics, 7(1), 9–27.
Mai, J. E. (2016). Big data privacy: The datafication of personal information. The Information Society, 32(3), 192–199.
Milano, S., Taddeo, M., & Floridi, L. (2020). Recommender systems and their ethical challenges. AI & Society, 35(4), 957–967.
Monahan, T. (2016). Built to lie: Investigating technologies of deception, surveillance, and control. The Information Society, 32(4), 229–240.
Moor, J. H. (1997). Towards a theory of privacy in the information age. Computers and Society, 27(3), 27–32.
Moore, A. (2005). Information ethics: Privacy, property, and power. University of Washington Press.
Nissenbaum, H. F. (2010). Privacy in context: Technology, policy, and the integrity of social life. Stanford Law Books.
Purtova, N. (2018). The law of everything: Broad concept of personal data and future of EU data protection law. Law, Innovation and Technology, 10(1), 40–81.
Pasquale, F. (2015). Black box society: The secret algorithms that control money and information. Harvard University Press.
Ricoeur, P. (1980). Narrative time. Critical Inquiry, 7(1), 169–190.
Ricoeur, P. (1992). Oneself as another. University of Chicago Press.
Schüll, N. D. (2019). The data-based self: Self-quantification and the data-driven life. Social Research, 86(4), 909–930.
Schüll, N. D. (2018). Self in the loop: Bits, patterns, and pathways in the quantified self. In Z. Papacharissi (Ed.), The networked self, vol.5: Human augmentics, artificial intelligence, sentience (pp. 25–38). Routledge.
Smith, C. H. (2020). Corporatised identities ≠ digital identities: Algorithmic filtering on social media and the commercialisation of presentations of self. In C. Burr, & L. Floridi (Eds.), Ethics of digital well-being (pp. 55–80). Springer.
Solove, D. J. (2008). Understanding privacy. Harvard University Press.
Solove, D. J. (2004). The digital person: Technology and privacy in the information age. NYU Press.
Søe, S. O. (2021). Nonnatural personal information: Accounting for misleading and non-misleading personal information. Philosophy & Technology, 34(2), 1243–1262.
Søe, S. O., Jørgensen, R. F., & Mai, J. E. (2021). What is the ‘personal’ in ‘personal information’? Ethics and Information Technology, 23(4), 625–633.
Tavani, H. T. (2008). Informational privacy: Concepts, theories, and controversies. In K. E. Himma, & H. T. Tavani (Eds.), The handbook of information and computer ethics (pp. 131–164). Wiley.
Thylstrup, N. B. (2014). Archival shadows in the digital age. Nordisk Tidsskrift for Informationsvidenskab og Kulturformidling, 3(2/3), 29–39.
van Dijck, J. (2013). ‘You have one identity’: Performing the self on Facebook and LinkedIn. Media Culture & Society, 35(2), 199–215.
von der Leyen, U. (2020). State of the Union Address by President von der Leyen at the European Parliament Plenary. https://ec.europa.eu/commission/presscorner/detail/en/SPEECH_20_1655
Warren, S. D., & Brandeis, L. D. (2005). The right to privacy. In A. D. Moore (Ed.), Information ethics: Privacy, property, and power (pp. 209–225). University of Washington Press. (original work published 1890).
Westin, A. F. (1967). Privacy and freedom. Atheneum.
Wittgenstein, L. (2009). Philosophical investigations (P. M. S. Hacker & J. Schulte, Eds.; G. E. M. Anscombe, P. M. S. Hacker, & J. Schulte, Trans.). Blackwell Publishing. (original work published 1953).
Wittkower, D. E. (2011). Time in unfixed are you. In D. E. Wittkower (Ed.), Philip K. Dick and philosophy. Do androids have kindred spirits? (pp. 293–297). Open Court.
Hacking, I. (1996). The looping effects of human kinds. In D. Sperber, D. Premack, & A. J. Premack (Eds.), Causal cognition: A multidisciplinary debate. Oxford University Press.
Wallace, K. (2019). The network self: Relation, process, and personal identity. Routledge Studies in American Philosophy. Routledge.
Acknowledgements
We would like to thank our colleagues in the Don’t Take it Personal project – Bjarki Valtysson, Johan Lau Munkholm, Rikke Frank Jørgensen, and Tanja Wiehn – for helpful comments and constructive discussions along the way. We would also like to thank two anonymous reviewers for very constructive and helpful comments on an earlier version of this article.
Funding
This research was conducted within the project “Don’t Take it Personal: Privacy and Information in an Algorithmic Age”, which is generously funded by the Independent Research Fund Denmark, grant number 8018-00041B.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflicts of interest/competing interests
Not applicable.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Søe, S. O., & Mai, J. E. (2022). Data identity: privacy and the construction of self. Synthese, 200, 492. https://doi.org/10.1007/s11229-022-03968-5