Review of Virginia Eubanks (2018). Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.
Keywords: Algorithmic cultures · Automatization · High-tech · Inequality · Postdigital · Poverty
They tell us that big data shakes up hidebound bureaucracies, stimulates innovative solutions, and increases transparency. But when we focus on programs specifically targeted at poor and working-class people, the new regime of data analytics is more evolution than revolution. (Eubanks 2018: 36)
Technologies and algorithms are part and parcel of the postdigital society. Pointing towards the difficulty of separating digital and analog spheres, the postdigital approach describes a blurred environment where our fascination with digital technologies belongs to the past (Cramer 2015), and where digital systems and tools have become invisible (Cormier et al. 2019). Contemporary humans not only coexist with sophisticated machines; more importantly, these machines attain increasing power over human actions. Examples include technological opportunities for disrupting the traditional political economy of the production and distribution of academic knowledge (Jandrić and Hayes 2019), and technological potentials for democratizing and disrupting education through tools such as massive open online courses (Littlejohn and Hood 2018). With its nuanced descriptions of the complex and messy relationships between human beings and technologies (Jandrić et al. 2018), the postdigital perspective offers a suitable theoretical framework for analyzing their changing power relationships.
In Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, Virginia Eubanks (2018) asserts that these power relationships indicate, often in a disguised way, a rising dependence on digital tools and gadgets. The postdigital condition reinforces this tendency, because our social relationships are dialectically intertwined with the digital cultures that surround us (Escaño 2019). According to Knox (2015), digital cultures have evolved from cybercultures to algorithmic cultures through an intermediate phase of community cultures. In Automating Inequality, Eubanks (2018) refers predominantly to the current phase of algorithmic cultures, in which high-tech promotes automated decision-making and exacerbates inequalities and privileges. Eubanks shows that algorithmic computer operations have been introduced as neutral facilitators. Viewed as tools that produce great disruptive changes by themselves, technologies are presented as fair instruments designed to democratize communication, education, and participation in the public sphere. Following this logic, media literacy has become instrumentalized and focused almost exclusively on the acquisition of skills required for taking part in the digital world (Emejulu and McGregor 2016). The absence of critical consciousness about the nature of algorithmic cultures is noted by critical authors who argue for a postdigital critical media pedagogy that reflects on the political and philosophical discourses hidden in technological tools (Emejulu and McGregor 2016; Jandrić 2017; Jandrić et al. 2018). In this spirit, critical media literacy and critical data literacy are urgently needed in our postdigital age (Jandrić 2019).
Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (Eubanks 2018) offers a fascinating glimpse into the ways in which the political and philosophical background of digital technologies supports new forms of a ‘society of control’ in which technologies are designed to observe, track, monitor, and tag (see Jandrić 2017). In developed countries such as the USA, which suffer from unprecedented social inequality and a shrinking middle class (Chomsky 2017), important decisions in health and social assistance services are increasingly made by algorithmic entities. While humankind embraces digital technologies, technology reaches a new, independent stage; high levels of automation bring about a situation in which even the designers of these technologies cannot predict their behavior (Jandrić 2019). In Automating Inequality, Eubanks does not merely describe how automatization and the lack of human control affect the working class and the poor. More importantly, she draws on her own experience to bring together different, meticulously researched stories that capture the core impact of these technologies on poor and working-class people in America. This exploration of real personal stories enriches the book and brings it close to potential readers.
In three case-study chapters, Eubanks explores welfare decision-making technology in Indiana, homeless services in Los Angeles, and a child protection algorithm in Pennsylvania. This approach provides the opportunity to acknowledge tensions between technologies and the inequalities they promote during decision-making processes. When technologies reinforce inequalities, they are no longer neutral, and Eubanks vividly describes how automated decision-making systems punish the poor and the working class through pervasive mechanisms of control. Eubanks’ research starts in 2014. In a simple and colloquial style, she begins by narrating a personal experience and slowly introduces other protagonists. Through these stories, Eubanks shows how a few missing digits can determine people’s future. Therefore, she concludes, it is necessary to reflect on and promote discourses about how the most vulnerable sectors of the population face these ‘random failures of the technological system’, a system to which we are increasingly ceding both power and responsibility (Eubanks 2018). Each chapter is illustrated with heartbreaking stories, offering a stunningly detailed study from various perspectives including patients, caseworkers, activists, police officers, journalists, and other stakeholders. Through these stories, the book empowers the reader and makes a case for a fair and socially committed use of technology.
According to Eubanks, digital scrutiny is merely the latest stage in the history of the punishment of the poor. Since the nineteenth century, the aim to regulate poverty has set in motion powerful mechanisms of observation and manipulation. In the predigital age, decisions about housing and welfare were made by human beings. Based on a combination of objective and subjective perspectives, this approach served to relieve decision-makers’ conscience and social responsibility in the face of stark inequalities. Building on this historical overview, Eubanks shows that introducing high-tech tools into these processes has continued to ‘disempower poor and working-class people, denying their human rights, and violating their autonomy’ (Eubanks 2018). Poorhouses gave way to scientific charity, which later evolved into the digital poorhouse. On this basis, Eubanks explains how computers, presented as neutral tools for the optimization of public spending, have ended up being used as surveillance tools and private data collectors. Fed into opaque algorithms, this data then ends up determining individual human rights. At this point in the book, readers become more aware of the dire implications of handing over one’s personal data: the system equates poverty and homelessness with criminality. In a passive and socially accepted way, the poor are criminalized and discriminated against, thus legitimizing the status quo. Policies that benefit only a very small sector of the population have been increasingly applied by those whom Adam Smith called ‘the masters of mankind’. There is, however, a general lack of popular reaction against these trends (Chomsky 2017).
Digital tools convince us that the neediest people are getting help based on parameters and information collected in well-established systems. Look the other way, do not get involved, and everything will be fine: poverty is under control. These digital, high-capacity systems, however, do not accept doubts, errors, or exceptions. Paradoxically, if the algorithmic sequence fails, a person may lose their healthcare or drop off a housing waiting list. Therefore, we need a critical postdigital perspective to understand the ways in which algorithms transcend digital boundaries. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (Eubanks 2018) is an important read for citizens concerned about the relationships between social justice and high-tech, and the impacts of these relationships on our postdigital world. Targeting some of the most vulnerable sectors of the population, technological policies cross borders that should not be crossed (such as human rights). The problem runs much deeper than (data) surveillance and (often false) algorithmic prediction. Deprived of resources and paralyzed by fear, the victims of this politico-technological system are helpless. Amid these horror stories, Eubanks identifies an opportunity to realize the necessary social commitment of all and for all. In her words: ‘cultural denial is the process that allows us to know about cruelty, discrimination, and repression, but never openly acknowledge it. It is how we come to know what not to know’ (Eubanks 2018: 144). The digital poorhouse is hard to understand, massively scalable, and persistent, and yet we all live in it. Technologies of the past structured the world in visible, physical ways, but the latest bout of postdigital algorithmic restructuring is simultaneously material, non-material, and opaque (Hayes 2019). Digital technologies create the master narratives of today (Fuller and Jandrić 2019).
These master narratives are part and parcel of so-called ‘cybernetic capitalism’ (Peters and Jandrić 2018), and can be counterbalanced only through postdigital approaches (Jandrić et al. 2018; Cormier et al. 2019). Arguably, the first step in dismantling the digital poorhouse is to change the ways in which we refer to poverty. Poverty, in its various guises such as illiteracy, is a form of repression, a mode of domination, and a form of capitalist production (Peters et al. 2018). Once we recognize the postdigital nature of our own involvement, it becomes possible to develop a critical perspective for potential change, starting with a reconsideration of technological systems and their applications. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (Eubanks 2018) may serve as a necessary turning point within the reach of any reader who wants to step out of their comfort zone and take an active role in addressing social inequalities. Such a postdigital approach may not be comfortable (Cormier et al. 2019), yet it is a necessary philosophical approach for anyone committed to equality, social justice, and emancipation.
- Chomsky, N. (2017). Requiem for the American dream: The 10 principles of concentration of wealth & power. New York: Seven Stories Press.
- Cormier, D., Jandrić, P., Childs, M., Hall, R., White, D., Phipps, L., Truelove, I., Hayes, S., & Fawns, T. (2019). Ten years of the postdigital in the 52group: Reflections and developments 2009–2019. Postdigital Science and Education, 1(2), 475–506. https://doi.org/10.1007/s42438-019-00049-8
- Escaño, C. (2019). Sociedad postdigital (ontología de la remezcla) [Postdigital society (an ontology of the remix)]. Iberoamérica Social: Revista-Red de Estudios Sociales, 7(XII), 51–53.
- Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. New York: St. Martin’s Press.
- Hayes, S. (2019). No false promises. Postdigital Science and Education, 1(1), 4–7. https://doi.org/10.1007/s42438-019-0032-0
- Jandrić, P. (2017). Learning in the age of digital reason. Rotterdam: Sense.
- Knox, J. (2015). Critical education and digital cultures. In M. A. Peters (Ed.), Encyclopedia of educational philosophy and theory (pp. 1–6). https://doi.org/10.1007/978-981-287-588-4.