In light of the recent emergence of predictive techniques in law enforcement to forecast crimes before they occur, this paper examines the temporal operation of power exercised by predictive policing algorithms. I argue that predictive policing exercises power through a paranoid style that constitutes a form of temporal governmentality. Temporality is especially pertinent to understanding what is ethically at stake in predictive policing, as it is continuous with a historical racialized practice of organizing, managing, controlling, and stealing time. After first clarifying the concept of temporal governmentality, I apply this lens to the Chicago Police Department’s Strategic Subject List. This predictive algorithm operates, I argue, through a paranoid logic that aims to preempt future possibilities of crime on the basis of a criminal past codified in historical crime data.
Pearsall (2009), p. 16.
Robinson and Koepke (2016).
For a helpful summary of the differences between place-based and person-based predictive systems, see Ferguson (2017). Advocates of place-based programs defend these as less problematic than their person-based counterparts insofar as they target locations of possible crime rather than subjects of future crime. See Beck and McCue (2009). Advocates for person-based systems have argued that these types of programs are more effective as predictive tools for targeting specific kinds of crime in a city.
Cathy O’Neil provocatively describes predictive policing as a “weapon of math destruction”—a mathematical model that has harmful effects on precarious social groups. See O’Neil (2016), p. 3.
See O’Neil (2016), p. 86 and Alexander (2012). A range of scholars emphasize the mutually constitutive relationship between policing and race in the U.S., noting the way in which policing has been historically central for the formation and maintenance of racial hierarchies. Coramae Richey Mann, for instance, argues that policing in the U.S. has its roots in slavery, with slave patrols constituting the first state-sponsored police forces. See Mann (1993), pp. 165, 195. See also Adamson (1983), Russell (1998), and Bass (2001).
See Ziewitz (2016), p. 10.
See Gillespie (2016), pp. 19, 26.
Foucault (1997), p. xx.
See Introna (2016), p. 19.
Foucault (2004/2008), pp. 12–13.
See Collier (2009), p. 96.
Beer (2017), p. 9.
See Gillespie (2014).
I follow scholars like Bucher (2012) and Ananny and Crawford (2016) who have shown the limits of an analytic of visibility and an accompanying ethics of transparency to the critical study of algorithms. Bucher (2012) shows how the Facebook EdgeRank algorithm works differently from the Panoptic form of surveillance insofar as it imposes a ‘threat of invisibility’ on users. Ananny and Crawford (2016) explore the limits of the ideal of transparency for understanding governing algorithmic systems and for holding these systems accountable.
This performative understanding of time is inspired by Bruno Latour. See Latour (1988), pp. 50, 165.
Mills (2014), p. 28.
See Maguire (2000).
This reactive style of policing is at work in disciplinary power, which proactively shapes the prisoner only after it has first situated them as a prisoner by reacting to their crime. Disciplinary power thus does not capture the temporal governmentality of predictive policing insofar as it is reactive to crime, whereas predictive policing is proactive.
Beck and McCue (2009), p. 20, italics added.
Pearsall (2009), p. 17.
See Kaplan (2017), available at https://southsideweekly.com/predictive-policing-long-road-transparency/.
Chicago Police Department (2016), p. 1, available at http://directives.chicagopolice.org/directives/data/a7a57b85-155e9f4b-50c15-5e9f-7742e3ac8b0ab2d3.html.
See Chicago Police Department, Special Order S09-11 (2016), p. 1.
Chicago Police Department (2015), p. 1, available at http://directives.chicagopolice.org/directives/data/a7a57bf0-1456faf9-bfa14-570a-a2deebf33c56ae59.html.
Ibid. The Custom Notifications directive does not specify what counts as “victimization” or “engagement” in criminal activity, but treats these as equivalent in the process of notifying subjects. Between 2013 and 2016, the CPD delivered roughly 1400 custom notifications. See Martinez (2016), http://chicago.cbslocal.com/2016/05/31/going-inside-the-chicago-police-departments-strategic-subject-list/ and Posadas (2017), https://medium.com/equal-future/how-strategic-is-chicagos-strategic-subjects-list-upturn-investigates-9e5b4b235a7c.
Chicago Police Department, Special Order S10-05 (2015), p. 2.
Amoore links this preemptive activity of risk-assessment algorithms with the strategy of juridical decision associated with sovereign power. See Amoore (2013), pp. 41, 82–83.
While the continual revision of the SSL might seem to challenge this preemptive activity, it is ultimately updated in order to improve the algorithm’s predictive power, and hence to better preempt future crime. The idea here is that preemption contributes to the aim of predictive policing technologies and guides their revisions even when (or especially when) they are not successful in preempting crime.
This marks a difference between Amoore’s account of preemption in predictive algorithms and my own insofar as I understand preemption to be racialized in the case of policing algorithms like the SSL. While Amoore presents preemption as a general feature of risk-assessment algorithms that appears to apply equally to all subjects, my own view is that preemption is differentially applied to racialized subjects, and thus cannot be fully understood without considering how it is entangled with a racial politics of time.
Sedgwick (2003), p. 130, italics added.
Ibid., p. 131, italics added.
Saunders et al. (2016), p. 364. This study also found that “at-risk individuals were not more or less likely to become victims of a homicide or shooting as a result of the SSL, and this is further supported by city-level analysis finding no effect on the city homicide trend.” (Ibid.)
Ferguson (2017), p. 40 and Davey (2016) available at https://www.nytimes.com/2016/05/24/us/armed-with-data-chicago-police-try-to-predict-who-may-shoot-or-be-shot.html.
See https://data.cityofchicago.org/Public-Safety/Strategic-Subject-List-Dashboard/wgnt-sjgb. Another study found that over 50% of Black men in Chicago between the ages of 20 and 29 have an SSL score. See Kunichoff and Sier (2017), available at http://www.chicagomag.com/city-life/August-2017/Chicago-Police-Strategic-Subject-List/. While the SSL algorithm does not explicitly use race to calculate risk scores, the publicly available data set from 2012 to 2016 identifies subjects with demographic variables like race and gender.
Chicago Police Department, Special Order S10-05 (2015), p. 2, available at http://directives.chicagopolice.org/directives/data/a7a57bf0-1456faf9-bfa14-570a-a2deebf33c56ae59.html.
Moses and Chan (2016), p. 4.
See Gitelman (2013), p. 2.
Ferguson (2017), p. 47.
As historian Daniel Rosenberg reminds us, ‘data’ is the plural form of the Latin ‘datum,’ the past participle of the verb ‘dare’—to give. Hence, the plural ‘data’ and the singular ‘datum’ literally mean “something given” or “something taken for granted.” See Rosenberg in Gitelman (2013), p. 18.
As Barocas and Selbst observe in connection with data mining, “Data mining can reproduce existing patterns of discrimination, inherit the prejudice of prior decision makers, or simply reflect the widespread biases that persist in society.” See Barocas and Selbst (2016), p. 674.
According to a 2017 investigation by the Department of Justice Civil Rights Division, patterns of racially discriminatory conduct pervade the Chicago Police Department. See U.S. Department of Justice, Civil Rights Division (2017), p. 15, available at https://www.justice.gov/opa/file/925846/download and https://data.cityofchicago.org/Public-Safety/Strategic-Subject-List-Dashboard/wgnt-sjgb.
See Završnik (2018), p. 12.
Mills (2014), p. 28.
Foucault (2000), p. 341.
Coates (2015), p. 91, italics added.
See Cooper (2017), available at https://www.ted.com/talks/brittney_cooper_the_racial_politics_of_time.
Adamson, C. R. (1983). Punishment after slavery: Southern state penal systems, 1865–1890. Social Problems, 30(5), 555–569.
Alexander, M. (2012). The new Jim Crow: Mass incarceration in the age of colorblindness. New York: The New Press.
Amoore, L. (2013). The politics of possibility: Risk and security beyond probability. Durham: Duke University Press.
Ananny, M. (2016). Toward an ethics of algorithms: Convening, observation, probability, and timeliness. Science, Technology, & Human Values, 41(1), 93–117.
Ananny, M., & Crawford, K. (2016). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society. https://doi.org/10.1177/1461444816676645.
Barocas, S., & Selbst, A. D. (2016). Big data’s disparate impact. California Law Review, 104, 671–732.
Bass, S. (2001). Policing space, policing race: Social control imperatives and police discretionary decisions. Social Justice, 28(1), 156–176.
Beck, C., & McCue, C. (2009). Predictive policing: What we can learn from Wal-Mart and Amazon about fighting crime during a recession. The Police Chief, 76(11), 20–29.
Beer, D. (2009). Power through the algorithm? Participatory web cultures and the technological unconscious. New Media & Society, 11(6), 985–1002.
Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(1), 1–13.
Bratton, W., Morgan, J., & Malinowski, S. (2009). Fighting crime in the information age: The promise of predictive policing. Draft. Retrieved from https://publicintelligence.net/lapd-research-paper-fighting-crime-in-the-information-age-the-promise-of-predictive-policing/.
Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180.
Cheney-Lippold, J. (2011). A new algorithmic identity: Soft biopolitics and the modulation of control. Theory, Culture & Society, 28(6), 164–181.
Chicago Police Department. (2015). “Custom notifications in Chicago.” http://directives.chicagopolice.org/directives/data/a7a57bf0-1456faf9-bfa14-570a-a2deebf33c56ae59.html. Accessed 16 Mar 2017.
Chicago Police Department. (2016). “Strategic Subject List (SSL) dashboard.” http://directives.chicagopolice.org/directives/data/a7a57b85-155e9f4b-50c15-5e9f-7742e3ac8b0ab2d3.html. Accessed 16 Mar 2017.
Coates, T.-N. (2015). Between the world and me. New York: Spiegel & Grau.
Collier, S. J. (2009). Topologies of power: Foucault’s analysis of political government beyond ‘governmentality’. Theory, Culture & Society, 26(6), 78–108.
Cooper, B. (2017). The racial politics of time (video). Retrieved from https://www.ted.com/talks/brittney_cooper_the_racial_politics_of_time.
Davey, M. (2016). “Chicago police try to predict who may shoot or be shot.” https://www.nytimes.com/2016/05/24/us/armed-with-data-chicago-police-try-to-predict-who-may-shoot-or-be-shot.html. Accessed 16 Mar 2017.
Esposito, E. (2015). Beyond the promise of security: Uncertainty as resource. Telos, 170, 89–107.
Ferguson, A. G. (2017). The rise of big data policing: Surveillance, race, and the future of law enforcement. New York: New York University Press.
Foucault, M. (1975/1997). Discipline and punish: The birth of the prison (A. Sheridan, Trans.). New York: Vintage.
Foucault, M. (1976/1978). The history of sexuality, volume 1: The will to know. New York: Pantheon Books.
Foucault, M. (2000). Power: Essential works of Foucault, 1954–1984, volume 3 (J. D. Faubion, Ed.). New York: New Press.
Foucault, M. (2004/2008). The birth of biopolitics: Lectures at the Collège de France, 1978–1979. New York: Palgrave Macmillan.
Friedman, B., & Nissenbaum, H. (1996). Bias in computer systems. ACM Transactions on Information Systems (TOIS), 14(3), 330–347.
Gillespie, T. (2016). Algorithm. In B. Peters (Ed.), Digital keywords: A vocabulary of information society and culture (pp. 18–30). Princeton: Princeton University Press.
Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–193). Cambridge: MIT Press.
Gitelman, L. (Ed.). (2013). ‘Raw Data’ is an oxymoron. Cambridge: MIT Press.
Gordon, C. (1991). Governmental rationality: An introduction. In G. Burchell, C. Gordon & P. Miller (Eds.), The Foucault effect: Studies in governmentality. Chicago: University of Chicago Press.
Harcourt, B. E. (2007). Against prediction: Profiling, policing, and punishing in an actuarial age. Chicago: University of Chicago Press.
Hofstadter, R. (1965). The paranoid style in American politics and other essays. New York: Knopf.
Introna, L. D. (2016). Algorithms, governance, and governmentality. Science, Technology & Human Values, 41(1), 17–49.
Kaplan, J. (2017). Predictive policing and the long road to transparency. South Side Weekly. Retrieved July 12, 2017, from https://southsideweekly.com/predictive-policing-long-road-transparency/.
Koopman, C. (2013). Genealogy as critique: Foucault and the problems of modernity. Bloomington: Indiana University Press.
Koopman, C. (2014). Michel Foucault’s critical empiricism Today: Concepts and analytics in the critique of biopower and infopower. In J. D. Faubion (Ed.), Foucault now: Current perspectives in Foucault studies (pp. 88–111). Cambridge: Polity Press.
Kraemer, F., van Overveld, K., & Peterson, M. (2011). Is there an ethics of algorithms? Ethics and Information Technology, 13(3), 251–260.
Kunichoff, Y., & Sier, P. (2017). The contradictions of Chicago police’s secretive list. Chicago Magazine. Retrieved August 21, 2017, from http://www.chicagomag.com/city-life/August-2017/Chicago-Police-Strategic-Subject-List/.
Latour, B. (1988). The pasteurization of France (A. Sheridan & J. Law, Trans.). Cambridge: Harvard University Press.
Lemke, T. (2001). ‘The birth of bio-politics’: Michel Foucault’s lecture at the Collège de France on neo-liberal governmentality. Economy and Society, 30(2), 190–207.
Lyon, D. (2014). Surveillance, Snowden, and big data: Capacities, consequences, critique. Big Data & Society, 2014, 1–13.
Maguire, M. (2000). Policing by risks and targets: Some dimensions and implications of intelligence-led crime control. Policing and Society, 9(4), 315–336.
Mann, C. R. (1993). Unequal justice: A question of color. Indianapolis: Indiana University Press.
Martinez, M. (2016). “Going inside the Chicago police department’s ‘Strategic Subject List.’” https://chicago.cbslocal.com/2016/05/31/going-inside-the-chicago-police-departments-strategic-subject-list/. Accessed 16 Mar 2017.
McCulloch, J., & Wilson, D. (2016). Pre-crime: Pre-emption, precaution and the future. London: Routledge.
Mills, C. W. (2014). White time: The chronic injustice of ideal theory. Du Bois Review, 11(1), 27–42.
Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society, 2016, 1–21.
Moses, L. B., & Chan, J. (2016). Algorithmic prediction in policing: Assumptions, evaluation, and accountability. Policing and Society. https://doi.org/10.1080/10439463.2016.1253695.
National Institute of Justice. (2009). Predictive policing symposiums. Retrieved from https://www.ncjrs.gov/pdffiles1/nij/242222and248891.pdf.
O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. New York: Crown.
Pearsall, B. (2009). Predictive policing: The future of law enforcement? NIJ Journal, 266, 16–19.
Posadas, B. (2017). How strategic is Chicago’s ‘Strategic Subject List’? Upturn investigates. Medium. Retrieved June 22, 2017, from https://medium.com/equal-future/how-strategic-is-chicagos-strategic-subjects-list-upturn-investigates-9e5b4b235a7c.
Robinson, D., & Koepke, L. (2016). Stuck in a pattern: Early evidence on ‘predictive policing’ and civil rights. Upturn report: pp 1–29. Retrieved from https://www.teamupturn.com/reports/2016/stuck-in-a-pattern.
Rose, N. S., O’Malley, P., & Valverde, M. (2006). Governmentality. Annual Review of Law and Social Science, 2, 83–104.
Russell, K. K. (1998). The color of crime: Racial hoaxes, white fear, black protectionism, police harassment, and other macroaggressions. New York: New York University Press.
Saunders, J., Hunt, P., & Hollywood, J. S. (2016). Predictions put into practice: A quasi-experimental evaluation of Chicago’s predictive policing pilot. Journal of Experimental Criminology, 12, 347–371.
Sedgwick, E. K. (2003). Touching feeling: Affect, pedagogy, performativity. Durham: Duke University Press.
Wilson, D. (2018). Algorithmic patrol: The futures of predictive policing. In A. Završnik (Ed.), Big data, crime and social control. London: Routledge.
Završnik, A. (Ed.). (2018). Big data, crime and social control. London: Routledge.
Zedner, L. (2007). Pre-crime and post-criminology? Theoretical Criminology, 11(2), 261–281.
Ziewitz, M. (2016). Governing algorithms: Myth, mess, and methods. Science, Technology & Human Values, 41(1), 3–16.
Sheehey, B. Algorithmic paranoia: the temporal governmentality of predictive policing. Ethics Inf Technol 21, 49–58 (2019). https://doi.org/10.1007/s10676-018-9489-x