Algorithmic paranoia: the temporal governmentality of predictive policing


In light of the recent emergence of predictive techniques in law enforcement to forecast crimes before they occur, this paper examines the temporal operation of power exercised by predictive policing algorithms. I argue that predictive policing exercises power through a paranoid style that constitutes a form of temporal governmentality. Temporality is especially pertinent to understanding what is ethically at stake in predictive policing as it is continuous with a historical racialized practice of organizing, managing, controlling, and stealing time. After first clarifying the concept of temporal governmentality, I apply this lens to Chicago Police Department’s Strategic Subject List. This predictive algorithm operates, I argue, through a paranoid logic that aims to preempt future possibilities of crime on the basis of a criminal past codified in historical crime data.



  1. Pearsall (2009), p. 16.

  2. Robinson and Koepke (2016).

  3. Ibid., 2.

  4. For a helpful summary of the differences between place-based and person-based predictive systems, see Ferguson (2017). Advocates of place-based programs defend these as less problematic than their person-based counterparts insofar as they target locations of possible crime rather than subjects of future crime. See Beck and McCue (2009). Advocates for person-based systems have argued that these types of programs are more effective as predictive tools for targeting specific kinds of crime in a city.

  5. For a brief pre-history of predictive policing, see Wilson (2018) in Završnik (2018).

  6. Cathy O’Neil provocatively describes predictive policing as a “weapon of math destruction”—a mathematical model that has harmful effects on precarious social groups. See O’Neil (2016), p. 3.

  7. See Amoore (2013), Gillespie (2014), Introna (2016), and Beer (2009, 2017). For a connected discussion of algorithms as value-laden, see Friedman and Nissenbaum (1996), Kraemer et al. (2011), and Mittelstadt et al. (2016).

  8. See Amoore (2013), Esposito (2015), and Ananny (2016).

  9. See McCulloch and Wilson (2016), Robinson and Koepke (2016), O’Neil (2016), and Ferguson (2017).

  10. See and

  11. See Zedner (2007), p. 262; Lyon (2014), p. 6; and McCulloch and Wilson (2016), p. 3.

  12. See O’Neil (2016), p. 86 and Alexander (2012). A range of scholars emphasize the mutually constitutive relationship between policing and race in the U.S., noting the way in which policing has been historically central to the formation and maintenance of racial hierarchies. Coramae Richey Mann, for instance, argues that policing in the U.S. has its roots in slavery, with slave patrols constituting the first state-sponsored police forces. See Mann (1993), pp. 165, 195. See also Adamson (1983), Russell (1998), and Bass (2001).

  13. See Ziewitz (2016), p. 10.

  14. See Gillespie (2016), pp. 19–22 and Ananny (2016), p. 97.

  15. See Gillespie (2016), pp. 19, 26.

  16. Foucault (1997), p. xx.

  17. See Introna (2016), p. 19.

  18. Foucault (2004/2008), pp. 12–13.

  19. See Collier (2009), p. 96.

  20. See Gordon (1991), Lemke (2001), and Rose et al. (2006).

  21. See Foucault (2004/2008), p. 19.

  22. Foucault (1975/1997), p. 27.

  23. See Foucault (1975/1997, 1976/1978).

  24. Introna (2016), p. 28. Foucault refers to the exercise of power as a “conduct of conducts” (conduire des conduites) in a 1978 essay translated from the French as “How is Power Exercised?” See Foucault (1978, 2000), p. 341.

  25. Beer (2017), p. 9.

  26. See Harcourt (2007), Amoore (2013), Gillespie (2014), and Beer (2017).

  27. See Gillespie (2014).

  28. See Cheney-Lippold (2011), Bucher (2012), and Amoore (2013), respectively.

  29. Foucault (1975/1997).

  30. Foucault (1975/1997), p. 200.

  31. Ibid., 201.

  32. I follow scholars like Bucher (2012) and Ananny and Crawford (2016) who have shown the limits of an analytic of visibility and an accompanying ethics of transparency for the critical study of algorithms. Bucher (2012) shows how the Facebook EdgeRank algorithm works differently from the panoptic form of surveillance insofar as it imposes a ‘threat of invisibility’ on users. Ananny and Crawford (2016) explore the limits of the ideal of transparency for understanding and governing algorithmic systems and for holding these systems accountable.

  33. See Esposito (2015), pp. 93–94 and Amoore (2013), p. 9. Where these scholars tend to focus on the temporality of predictive algorithms more generally, my own analysis is more attentive to how temporality is racialized in the specific case of predictive policing algorithms like the SSL.

  34. This performative understanding of time is inspired by Bruno Latour. See Latour (1988), pp. 50, 165.

  35. Mills (2014), p. 28.

  36.

  37. Ibid., 31.

  38. Ibid., 31.

  39. See Maguire (2000).

  40. This reactive style of policing is at work in disciplinary power, which proactively shapes the prisoner only after it has first situated them as a prisoner by reacting to their crime. Disciplinary power thus does not capture the temporal governmentality of predictive policing insofar as it is reactive to crime where predictive policing is proactive.

  41. See NIJ (2009), pp. 3–4 and Bratton et al. (2009), pp. 1–4.

  42. Beck and McCue (2009), p. 20, italics added.

  43. Pearsall (2009), p. 17.

  44. See and

  45. See Kaplan (2017), available at

  46.

  47. Chicago Police Department (2016), p. 1, available at

  48.

  49.

  50.

  51. See also

  52. Ibid. See also

  53. See Chicago Police Department, Special Order S09-11 (2016), p. 1.

  54. Ibid., 1–2.

  55. Chicago Police Department (2015), p. 1, available at

  56.

  57. Ibid. The Custom Notifications directive does not disambiguate between what counts as “victimization” and what counts as “engagement” in criminal activity, but rather treats these as equivalent in the process of notifying subjects. Between 2013 and 2016, the CPD delivered roughly 1400 custom notifications. See Martinez (2016) and Posadas (2017).

  58. Chicago Police Department, Special Order S10-05 (2015), p. 2.

  59. Ibid., 3.

  60. Amoore links this preemptive activity of risk-assessment algorithms with the strategy of juridical decision associated with sovereign power. See Amoore (2013), pp. 41, 82–83.

  61. While the continual revision of the SSL might seem to challenge this preemptive activity, it is ultimately updated in order to improve the algorithm’s predictive power, and hence to better preempt future crime. The idea here is that preemption contributes to the aim of predictive policing technologies and guides their revisions even when (or especially when) they are not successful in preempting crime.

  62. This marks a difference between Amoore’s account of preemption in predictive algorithms and my own insofar as I understand preemption to be racialized in the case of policing algorithms like the SSL. While Amoore presents preemption as a general feature of risk-assessment algorithms that appears to apply equally to all subjects, my own view is that preemption is differentially applied to racialized subjects, and thus cannot be fully understood without considering how it is entangled with a racial politics of time.

  63. See Coates (2015). Coates is here mimicking the title of Richard Hofstadter’s influential 1963 essay “The Paranoid Style in American Politics.” See Hofstadter (1965).

  64. Sedgwick (2003), p. 130, italics added.

  65. Ibid., 131, italics added.

  66. Saunders et al. (2016), p. 364. This study also found that “at-risk individuals were not more or less likely to become victims of a homicide or shooting as a result of the SSL, and this is further supported by city-level analysis finding no effect on the city homicide trend.” (Ibid.)

  67. Ferguson (2017), p. 40, and Davey (2016), available at

  68. Another study found that over 50% of Black men in Chicago between the ages of 20 and 29 have an SSL score. See Kunichoff and Sier (2017). While the SSL algorithm does not explicitly use race to calculate risk scores, the publicly available data set from 2012 to 2016 identifies subjects with demographic variables like race and gender.

  69. Chicago Police Department, Special Order S10-05 (2015), p. 2, available at

  70.

  71. Moses and Chan (2016), p. 4.

  72. See Gitelman (2013), p. 2.

  73. Ferguson (2017), p. 47.

  74. As historian Daniel Rosenberg reminds us, ‘data’ is the plural form of the Latin ‘datum,’ the past participle of the verb ‘dare’—to give. Hence, the plural ‘data’ and the singular ‘datum’ literally mean “something given” or “something taken for granted.” See Rosenberg in Gitelman (2013), p. 18.

  75. See McCulloch and Wilson (2016), p. 2; Lyon (2014), p. 6; and Zedner (2007), p. 262.

  76. As Barocas and Selbst observe in connection with data mining, “Data mining can reproduce existing patterns of discrimination, inherit the prejudice of prior decision makers, or simply reflect the widespread biases that persist in society.” See Barocas and Selbst (2016), p. 674.

  77. According to a 2017 investigation by the Department of Justice Civil Rights Division, patterns of racially discriminatory conduct pervade the Chicago Police Department. See USDJ Civil Rights Division (2017), p. 15, available at

  78. See Završnik (2018), p. 12.

  79. Mills (2014), p. 28.

  80. Foucault (2000), p. 341.

  81. Coates (2015), p. 91, italics added.

  82. See Cooper (2017), available at


  1. Adamson, C. R. (1983). Punishment after slavery: Southern state penal systems, 1865–1890. Social Problems, 30(5), 555–569.

  2. Alexander, M. (2012). The new Jim Crow: Mass incarceration in the age of colorblindness. New York: The New Press.

  3. Amoore, L. (2013). The politics of possibility: Risk and security beyond probability. Durham: Duke University Press.

  4. Ananny, M. (2016). Toward an ethics of algorithms: Convening, observation, probability, and timeliness. Science, Technology, & Human Values, 41(1), 93–117.

  5. Ananny, M., & Crawford, K. (2016). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society.

  6. Barocas, S., & Selbst, A. D. (2016). Big data’s disparate impact. California Law Review, 104, 671–732.

  7. Bass, S. (2001). Policing space, policing race: Social control imperatives and police discretionary decisions. Social Justice, 28(1), 156–176.

  8. Beck, C., & McCue, C. (2009). Predictive policing: What we can learn from Wal-Mart and Amazon about fighting crime during a recession. The Police Chief, 76(11), 20–29.

  9. Beer, D. (2009). Power through the algorithm? Participatory web cultures and the technological unconscious. New Media & Society, 11(6), 985–1002.

  10. Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(1), 1–13.

  11. Bratton, W., Morgan, J., & Malinowski, S. (2009). Fighting crime in the information age: The promise of predictive policing. Draft. Retrieved from

  12. Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180.

  13. Cheney-Lippold, J. (2011). A new algorithmic identity: Soft biopolitics and the modulation of control. Theory, Culture & Society, 28(6), 164–181.

  14. Chicago Police Department. (2015). Custom notifications in Chicago. Accessed 16 Mar 2017.

  15. Chicago Police Department. (2016). Strategic Subject List (SSL) dashboard. Accessed 16 Mar 2017.

  16. Coates, T.-N. (2015). Between the world and me. New York: Spiegel & Grau.

  17. Collier, S. J. (2009). Topologies of power: Foucault’s analysis of political government beyond ‘governmentality’. Theory, Culture & Society, 26(6), 78–108.

  18. Cooper, B. (2017). The racial politics of time (video). Retrieved from

  19. Davey, M. (2016). Chicago police try to predict who may shoot or be shot. Accessed 16 Mar 2017.

  20. Esposito, E. (2015). Beyond the promise of security: Uncertainty as resource. Telos, 170, 89–107.

  21. Ferguson, A. G. (2017). The rise of big data policing: Surveillance, race, and the future of law enforcement. New York: New York University Press.

  22. Foucault, M. (1975/1997). Discipline and punish: The birth of the prison (A. Sheridan, Trans.). New York: Vintage.

  23. Foucault, M. (1976/1978). The history of sexuality, volume 1: The will to know. New York: Pantheon Books.

  24. Foucault, M. (2000). Essential works, volume 3: Power (J. Faubion & P. Rabinow, Eds.). New York: New Press.

  25. Foucault, M. (2004/2008). The birth of biopolitics: Lectures at the Collège de France, 1978–1979. New York: Palgrave Macmillan.

  26. Friedman, B., & Nissenbaum, H. (1996). Bias in computer systems. ACM Transactions on Information Systems (TOIS), 14(3), 330–347.

  27. Gillespie, T. (2016). Algorithm. In B. Peters (Ed.), Digital keywords: A vocabulary of information society and culture (pp. 18–30). Princeton: Princeton University Press.

  28. Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–193). Cambridge: MIT Press.

  29. Gitelman, L. (Ed.). (2013). ‘Raw data’ is an oxymoron. Cambridge: MIT Press.

  30. Gordon, C. (1991). Governmental rationality: An introduction. In G. Burchell, C. Gordon, & P. Miller (Eds.), The Foucault effect: Studies in governmentality. Chicago: University of Chicago Press.

  31. Harcourt, B. E. (2007). Against prediction: Profiling, policing, and punishing in an actuarial age. Chicago: University of Chicago Press.

  32. Hofstadter, R. (1965). The paranoid style in American politics and other essays. New York: Knopf.

  33. Introna, L. D. (2016). Algorithms, governance, and governmentality. Science, Technology & Human Values, 41(1), 17–49.

  34. Kaplan, J. (2017). Predictive policing and the long road to transparency. South Side Weekly. Retrieved July 12, 2017, from transparency/.

  35. Koopman, C. (2013). Genealogy as critique: Foucault and the problems of modernity. Bloomington: Indiana University Press.

  36. Koopman, C. (2014). Michel Foucault’s critical empiricism today: Concepts and analytics in the critique of biopower and infopower. In J. D. Faubion (Ed.), Foucault now: Current perspectives in Foucault studies (pp. 88–111). Cambridge: Polity Press.

  37. Kraemer, F., van Overveld, K., & Peterson, M. (2011). Is there an ethics of algorithms? Ethics and Information Technology, 13(3), 251–260.

  38. Kunichoff, Y., & Sier, P. (2017). The contradictions of Chicago Police’s secretive list. Chicago Magazine. Retrieved August 21, 2017, from List/.

  39. Latour, B. (1988). The pasteurization of France (A. Sheridan & J. Law, Trans.). Cambridge: Harvard University Press.

  40. Lemke, T. (2001). ‘The birth of bio-politics’: Michel Foucault’s lecture at the Collège de France on neo-liberal governmentality. Economy and Society, 30(2), 190–207.

  41. Lyon, D. (2014). Surveillance, Snowden, and big data: Capacities, consequences, critique. Big Data & Society, 2014, 1–13.

  42. Maguire, M. (2000). Policing by risks and targets: Some dimensions and implications of intelligence-led crime control. Policing and Society, 9(4), 315–336.

  43. Mann, C. R. (1993). Unequal justice: A question of color. Indianapolis: Indiana University Press.

  44. Martinez, M. (2016). Going inside the Chicago police department’s ‘Strategic Subject List.’ Accessed 16 Mar 2017.

  45. McCulloch, J., & Wilson, D. (2016). Pre-crime: Pre-emption, precaution and the future. London: Routledge.

  46. Mills, C. W. (2014). White time: The chronic injustice of ideal theory. Du Bois Review, 11(1), 27–42.

  47. Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society, 2016, 1–21.

  48. Moses, L. B., & Chan, J. (2016). Algorithmic prediction in policing: Assumptions, evaluation, and accountability. Policing and Society.

  49. National Institute of Justice. (2009). Predictive policing symposiums. Retrieved from

  50. O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. New York: Crown.

  51. Pearsall, B. (2009). Predictive policing: The future of law enforcement? NIJ Journal, 266, 16–19.

  52. Posadas, B. (2017). How strategic is Chicago’s ‘Strategic Subject List’? Upturn investigates. Medium. Retrieved June 22, 2017, from

  53. Robinson, D., & Koepke, L. (2016). Stuck in a pattern: Early evidence on ‘predictive policing’ and civil rights. Upturn report, pp. 1–29. Retrieved from

  54. Rose, N. S., O’Malley, P., & Valverde, M. (2006). Governmentality. Annual Review of Law and Social Science, 2, 83–104.

  55. Russell, K. K. (1998). The color of crime: Racial hoaxes, white fear, black protectionism, police harassment, and other macroaggressions. New York: New York University Press.

  56. Saunders, J., Hunt, P., & Hollywood, J. S. (2016). Predictions put into practice: A quasi-experimental evaluation of Chicago’s predictive policing pilot. Journal of Experimental Criminology, 12, 347–371.

  57. Sedgwick, E. K. (2003). Touching feeling: Affect, pedagogy, performativity. Durham: Duke University Press.

  58. Wilson, D. (2018). Algorithmic patrol: The futures of predictive policing. In A. Završnik (Ed.), Big data, crime and social control. London: Routledge.

  59. Završnik, A. (Ed.). (2018). Big data, crime and social control. London: Routledge.

  60. Zedner, L. (2007). Pre-crime and post-criminology? Theoretical Criminology, 11(2), 261–281.

  61. Ziewitz, M. (2016). Governing algorithms: Myth, mess, and methods. Science, Technology & Human Values, 41(1), 3–16.


Author information



Corresponding author

Correspondence to Bonnie Sheehey.


Cite this article

Sheehey, B. Algorithmic paranoia: the temporal governmentality of predictive policing. Ethics Inf Technol 21, 49–58 (2019).



  • Algorithms
  • Predictive policing
  • Power
  • Ethics
  • Time