Abstract
The use of cognitive systems such as pattern recognition or video tracking technology in security applications is becoming ever more common. This paper considers cases in which cognitive systems are meant to assist human tasks by providing information, while the final decision is left to the human. All these systems and their various applications share a common feature: an intrinsic difference in how a situation or an event is assessed by a human being and by a cognitive system. This difference, here named “the model gap,” is analyzed with regard to its epistemic role and its ethical consequences. The main results are as follows: (1) The model gap is not a problem that might be solved by future research but the central feature of cognitive systems. (2) The model gap appears on two levels: which aspects of the world are evaluated, and how they are processed. This leads to changes in central concepts. While differences on the first level are often the very reason for deploying cognitive systems, those on the second level are hard to notice and often go unreflected. (3) Such missing reflection is ethically problematic because the human is meant to give the final judgment. It is particularly problematic in security applications, where it might lead to a conflation of descriptive and normative concepts. (4) The idea of the human operator having the last word is based on an assumption of independent judgment. This assumption is flawed for two reasons: the cognitive system and the human operators form a “hybrid system” whose components cannot be assessed independently, and additional modes of judgment might pose new ethical problems.
Notes
I follow the common use of the word “smart,” as in “smart security system” or “smart CCTV,” in both public and scientific discourse. Yet it is important to mention that this choice of words can contribute to the very misjudgment of security technology that is discussed in this paper.
In this context, the first processing steps of a cognitive system are often called “feature extraction” (Bishop 2006, 2). Yet I want to stress that this is already an interpretation that creates a new description, rather than the extraction of something that is already there.
EURODAC is a central database that uses fingerprints to identify all asylum seekers and illegal migrants in the European Union.
To include those systems in my account, I use the word “model” rather comprehensively, in the sense that not every part of a model must have a counterpart in the world.
Of course, it would be a mistake to consider video images or audio as unprocessed just because they are “less” processed than data from cognitive systems. Yet having additional data that did not influence the outcome of a cognitive system can be an advantage. In addition, our social skills and practices better reflect the epistemic implications of image and audio recordings, which have been available for roughly a century now. That, however, does not mean that their application cannot cause all kinds of problems—as the vast literature from media and surveillance studies testifies.
References
Ackerman MS (2000) The intellectual challenge of CSCW: the gap between social requirements and technical feasibility. Human Comput Interact 15(2):179–203
Bezdek JC (1994) What is computational intelligence? In: Zurada JM, Marks RJ, Robinson CJ (eds) Computational intelligence imitating life. IEEE Press, New York
Bishop C (2006) Pattern recognition and machine learning. Information Science and Statistics, Springer, New York
Bowker GC, Star SL (2000) Sorting things out—classification and its consequences. MIT Press, Cambridge
Brey P (2000) Disclosive computer ethics. ACM SIGCAS Comput Soc 30(4):10–16
Brey P (2005) The epistemology and ontology of human-computer interaction. Minds Mach 15(3–4):383–398
Clark A (2001) Reasons, robots and the extended mind. Mind Lang 16(2):121–145
Clark A, Chalmers DJ (1998) The extended mind. Analysis 58(1):7–19
de Vries SC (2005) UAVs and control delays. http://www.dtic.mil/cgi-bin/GetTRDoc?Location=U2&doc=GetTRDoc.pdf&AD=ADA454251. Accessed 16 Sept 2013
Dixon SR, Wickens CD (2006) Automation reliability in unmanned aerial vehicle control: a reliance-compliance model of automation dependence in high workload. Hum Factors 48(3):474–486
Dixon SR, Wickens CD, Chang D (2005) Mission control of multiple unmanned aerial vehicles: a workload analysis. Hum Factors 47(3):479–487
Dreyfus H (1992) What computers still can’t do: a critique of artificial reason. MIT Press, Cambridge
Eberhart RC (2007) Computational intelligence: concepts to implementations. Morgan Kaufmann, San Francisco
Haraway D (1991) A cyborg manifesto: science, technology, and socialist-feminism in the late twentieth century. In: Simians, cyborgs, and women: the reinvention of nature. Routledge, New York
Hu W, Tan T, Wang L, Maybank S (2004) A survey on visual surveillance of object motion and behaviors. IEEE Trans Syst Man Cybern Part C Appl Rev 34(3):334–352
Introna L (2005) Disclosive ethics and information technology: disclosing facial recognition systems. Ethics Inf Technol 7(2):75–86
Macnish K (2012) Unblinking eyes: the ethics of automating surveillance. Ethics Inf Technol 14(2):151–167
Norman DA (1993) Things that make us smart: defending human attributes in the age of the machine. Addison-Wesley Longman, Boston
Parasuraman R, Barnes M, Cosenzo K (2007) Adaptive automation for human-robot teaming in future command and control systems. The International C2 Journal 1(2):43–68
Reeves B, Nass C (1996) The media equation: how people treat computers, television, and new media like real people and places. Cambridge University Press, New York
Robinson D (1992) Implications of neural networks for how we think about brain function. Behav Brain Sci 15:644–655
Stamou G, Vogiatzis D, Stov S (1999) Bridging the gap between subsymbolic and symbolic techniques: a pragmatic approach. In: CSCC’99, Circuits systems communications and computers, Athens, July 1999
Turk M, Pentland A (1991) Face recognition using eigenfaces. In: Proceedings of the IEEE computer society conference on computer vision and pattern recognition (CVPR’91), pp 586–591
Van der Ploeg I (1999) The illegal body: ‘Eurodac’ and the politics of biometric identification. Ethics Inf Technol 1:295–302
Van der Ploeg I (2005) Biometrics and the body as information: normative issues of the socio-technical coding of the body. In: Lyon D (ed) Surveillance as social sorting. Routledge, London
Matzner, T. The model gap: cognitive systems in security applications and their ethical implications. AI & Soc 31, 95–102 (2016). https://doi.org/10.1007/s00146-013-0525-4