Abstract
The general idea developed in this paper from a sociological perspective is that some of the foundational categories on which the debate about privacy, security and technology rests are blurring. This process is a consequence of the merging of the physical and digital worlds. In order to define limits for the legitimate use of intrusive digital technologies, one has to draw on binary distinctions such as private versus public, human versus technical, and security versus insecurity. These distinctions developed in the physical world and are rooted in a pre-digital cultural understanding. Attempts to capture the problems emerging with the implementation of security technologies through legal reasoning encounter a number of difficulties, since law is by definition oriented backwards, adapting new developments to existing traditions, whereas the intrusion of new technologies into the physical world produces changes and creates fundamentally new problems.
Notes
As Luhmann states with regard to the difference between risk and danger: “The distinction presupposes … that uncertainty exists in relation to future loss. There are then two possibilities. The potential loss is either regarded as a consequence of the decision, that is to say, attributed to the decision. We then speak of risk – to be more exact of the risk of decision. Or the possible loss is considered to have been caused externally, that is to say, it is attributed to the environment. In this case we speak of danger.” (Luhmann 1993: 21)
This concept is common coinage among law enforcement agencies, and it precisely mirrors the recursive logic of assessment-evaluation-management-measurement, like an endless Möbius strip.
A frequent air traveller might wonder how such standards are defined and implemented when he or she is pulled out of the waiting queue for a special security check at the gate (see Kirschenbaum et al. 2012).
At the time of writing, the practices of different national security services were discussed in the media after NSA whistle-blower Edward Snowden had disclosed some of the practices employed by these agencies. Such disclosures nicely demonstrate the point. For most of the scandalous surveillance practices, the administration could produce some sort of quasi-legal justification.
The term collective semantic infrastructure is used here for lack of a better alternative to refer to a set of foundational categories or contrast pairs underlying a given culture or society. As Bauman (2000) points out, the categories of time and space are undergoing a fundamental change in “liquid” modernity. The same could be shown for categories of gender, where transgender discourses are eroding the binary distinction between the male and the female (see Ekins and King 2006). Other examples to be discussed later are the distinction between the natural and the artificial or life and death.
This requires a heterodox reading of classical texts of political philosophy, an exercise taking place mostly outside the mainstream debate (see Böhme and Böhme 1985).
The recent hype over the so-called “quantified self” is a pop-cultural offspring of what scholars like Nikolas Rose called the “biological Self” (Rose and Novas 2005). The idea of constantly monitoring vital signs such as heart rate or blood pressure and then adapting one’s daily life to an optimized path is a reflexive use of new technologies and an adaptation of the self-concept. Insurance companies may want to analyse such bio-data, calculating their clients’ health risks (which are financial risks for the companies). In any case, the idea of what it means to be human—reflexively, economically and politically—is changing with the injection of such monitoring technologies into the human organism.
The link between the private and the public sphere is made here through what is termed the “chilling effect”. Speaking out in public, knowing that any such public statement can be recorded or documented by the police or any other agent, is supposed to have a chilling effect on potential civic activists. Such effects may also emerge as side-effects of governing cyberspace (see Cohen 2003).
With the emergence of new techno-gadgets such as Google Glass, we may see a new turn of the screw with regard to privacy-intrusive technologies (see Boyd 2008).
The same court decided in 2008 that data processors have to take adequate measures to protect personal data from misuse.
The Big Brother metaphor tends to focus on surveillance, leaving aside the economic dimension of a data-driven economy. Collecting, processing and trading personal data has become a profitable business model. With every click on the computer, users create valuable intelligence not only for state authorities but for private enterprises as well.
Although trust and reciprocity developing in face-to-face interactions still provide the basis for social relationships, the “virtual/digital” is superseding the “natural/local” in the social fabric of society. This provides the basis for a new form that could be termed “inter-veillance”, as opposed to surveillance and sousveillance, creating new opportunities to follow up on the digital traces a person has left on the Internet (see Jansson 2011).
By way of anecdotal evidence, one could quote an article by Jessica Silver-Greenberg in The New York Times of 25 Dec 2012 (“Perfect 10? Never mind that. Ask for her credit score”), reporting on the use of credit scores as a parameter for choosing partners for dates in online forums.
References
Arendt, H. (1993). Was ist Politik?. München: Piper.
Badke-Schaub, P., Hofinger, G., & Lauche, K. (Eds.). (2008). Human factors. Psychologie sicheren Handelns in Risikobranchen. Heidelberg: Springer.
Baghai, K. (2012). Privacy as a human right: A sociological perspective. Sociology, 46(5), 951–965.
Ball, K., & Snider, L. (Eds.). (2013). The surveillance industrial complex: Towards a political economy of surveillance. London: Routledge.
Bauman, Z. (2000). Liquid modernity. Cambridge: Polity Press.
Beck, U. (1999). World risk society. London: Polity Press.
Bigo, D. (2002). Security and immigration: Toward a critique of the governmentality of unease. Alternatives: Global, Local, Political, 27(1), 63–92.
Böhme, G., & Böhme, H. (1985). Das Andere der Vernunft – Zur Entwicklung von Rationalitätsstrukturen am Beispiel Kants. Frankfurt/M: Suhrkamp.
Boyd, D. (2008). Facebook’s privacy trainwreck: Exposure, invasion, and social convergence. Convergence: The International Journal of Research into New Media Technologies, 14(1), 13–20.
Braudel, F. (1993). Civilization and capitalism, 15th–18th Century, Vol. I: The structure of everyday life. New York: Penguin Books.
Brown, S. (2006). The criminology of Hybrids. Rethinking crime and law in technosocial networks. Theoretical Criminology, 10(2), 223–244.
Buzan, B., Waever, O., & de Wilde, J. (1998). Security: A new framework for analysis. Boulder: Lynne Rienner Publishers.
Cohen, J. E. (2003). DRM and privacy. Berkeley Technology Law Journal, 18, 575–617.
Crouch, C. (2004). Post-democracy. Cambridge: Polity Press.
De Goede, M. (2008). The politics of preemption and the war on terror in Europe. European Journal of International Relations, 14(1), 161–185.
Ekins, R., & King, D. (2006). The transgender phenomenon. London: Sage.
Foschepoth, J. (2013). Überwachtes Deutschland. Post- und Telefonüberwachung in der alten Bundesrepublik. Göttingen: Vandenhoeck & Ruprecht.
Foucault, M. (1970). The order of things: An archaeology of the human sciences. New York: Vintage Books.
Gibson, W. (1984). Neuromancer. New York: Ace Books.
Giddens, A. (1990). The consequences of modernity. Stanford: Stanford University Press.
Giddens, A. (1991). Modernity and self-identity: Self and society in the late modern age. Stanford: Stanford University Press.
Glassner, B. (1999). The culture of fear: Why Americans are afraid of the wrong things. New York, NY: Basic Books.
Habermas, J. (1991). The structural transformation of the public sphere. Cambridge, MA: MIT Press.
Jansson, A. (2011). Perceptions of surveillance: Reflexivity and trust in digital domains. Paper presented at the annual meeting of the International Communication Association, Boston, MA, May 25, 2011. http://citation.allacademic.com/meta/p488315_index.html (last visited Jan 5, 2014).
Jungk, R. (1977). Der Atomstaat. München: Kindler.
Katz, J. M. (1970). The games bureaucrats play: Hide and seek under the freedom of information act. Texas Law Review, 48, 1261–1278.
Kirschenbaum, A., Mariani, M., Van Gulijk, C., Lubasz, S., Rapaport, C., & Andriessen, H. (2012). Airport security: An ethnographic study. Journal of Air Transport Management, 18, 68–73.
Kreissl, R., & Steinert, H. (2008). Für einen gesellschaftstheoretisch aufgeklärten Materialismus. Kriminologisches Journal, 40(4), 269–283.
La Mettrie, J. O. (1996). Machine man and other writings (trans. A. Thomson). Cambridge, UK: Cambridge University Press (first published 1747).
Lash, S., & Urry, J. (1994). Economies of signs and space. London: Sage Publishers.
Latour, B. (1988). The pasteurization of France. Cambridge, MA: Harvard University Press.
Lianos, M., & Douglas, M. (2000). Dangerization and the end of deviance. British Journal of Criminology, 40, 261–278.
Luhmann, N. (1993). Risk: A sociological theory. New York: De Gruyter.
Lyon, D. (1994). The electronic eye: The rise of the surveillance society. Minneapolis: University of Minnesota Press.
Lyon, D. (2007). Surveillance studies. Cambridge: Polity Press.
Macpherson, C. B. (2011). The political theory of possessive individualism, from Hobbes to Locke. Oxford: Oxford University Press.
Mead, G. H. (1934). Mind, self and society. Chicago: University of Chicago Press.
Nissenbaum, H. (2010). Privacy in context: Technology, policy, and the integrity of social life. Stanford: Stanford University Press.
O’Malley, P. (1996). Risk and Responsibility. In A. Barry, T. Osborne, & N. Rose (Eds.), Foucault and political reason: Liberalism, neo-liberalism, and rationalities of government (pp. 189–207). Chicago: University of Chicago Press.
Perrow, C. (1999). Normal accidents: Living with high-risk technologies. Princeton: Princeton University Press.
Rose, N., & Novas, C. (2005). Biological Citizenship. In S. J. Collier & A. Ong (Eds.), Global assemblages: Technology, politics and ethics as anthropological problems (pp. 439–463). Oxford: Blackwell Publishers.
Solove, D. J. (2001). Privacy and power: Computer databases and metaphors for information privacy. Stanford Law Review, 54(6), 1393–1462.
Solove, D. J. (2007). The future of reputation: Gossip, rumor, and privacy on the internet. New Haven: Yale University Press.
Tomasello, M. (2008). Origins of human communication. Cambridge, MA: MIT Press.
Tremain, S. (2006). Reproductive freedom, self-regulation, and the government of impairment in utero. Hypatia, 21(1), 33–53.
Urry, J. (2000). Sociology beyond societies: Mobilities for the twenty-first century. London: Routledge.
Warren, S., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, IV(5), 193–220.
Westin, A. (1967). Privacy and freedom. New York: Atheneum.
Zedner, L. (2005). Security and liberty in the face of terror: Reflections from criminal justice. Journal of Law and Society, 32(4), 507–533.
Zedner, L. (2007). Pre-crime and post-criminology? Theoretical Criminology, 11(2), 261–281.
Acknowledgement
I would like to thank my colleague David Wright for helpful advice and two anonymous reviewers for pointing out shortcomings in the draft.
Additional information
This paper grew out of a European research project IRISS (Increasing Resilience in Surveillance Societies) funded under the 7th Framework Program Socioeconomic Sciences and Humanities (Grant Agreement No 290492).
Cite this article
Kreissl, R. Assessing Security Technology’s Impact: Old Tools for New Problems. Sci Eng Ethics 20, 659–673 (2014). https://doi.org/10.1007/s11948-014-9529-9