The practices by which contract cheating companies find our students are worth exploring as one example of this kind of shadowing between the emergence of practices in higher education and their exploitation by companies whose influence we seek to reduce. Algorithms are all around us, and they drive a substantial amount of decision-making. They are also not inherently bad or damaging: in its most basic form, an algorithm is simply a set of instructions that turns a series of inputs into a series of outputs. It’s an algorithm that drives what Netflix thinks you want to watch next, what Instagram thinks you might click through to buy, and what Google anticipates you want from your search results. These experiences are sometimes creepy—like when a store you shop at knows before you do that you are pregnant (Hill, 2012)—and often aggressively capitalist, but they aren’t necessarily explicitly harmful.
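To make that bare-bones definition concrete, here is a purely illustrative sketch: a fixed set of instructions that deterministically maps an input (a viewing history) to an output (a suggestion). The function name, the labels, and the rules are invented for this example and are not any platform’s actual recommendation code.

```python
# A purely illustrative "algorithm" in the basic sense described above:
# a fixed set of instructions that turns inputs into outputs.
# The rules and labels here are invented for illustration only.

def recommend_next(watch_history: list[str]) -> str:
    """Suggest what to surface next, based only on the most recent item."""
    if not watch_history:
        return "popular-now"            # no input yet: fall back to a default
    last = watch_history[-1]
    if "documentary" in last:
        return "more-documentaries"     # simple rule: repeat what worked
    return "similar-to-" + last         # otherwise, suggest something alike

print(recommend_next(["baking show", "nature documentary"]))
# -> more-documentaries
```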
Algorithms are not neutral, however. Instead, they reflect the old adage of “garbage in, garbage out”: whatever biases underwrite the programming of an algorithm will be reflected in its outputs (Stinson, 2020). And since we live in a society that wrestles with racism, sexism, classism, ableism, and many other inequities, we should not be surprised that algorithms are often built in ways that reproduce those inequities. Virginia Eubanks described the use of algorithms in the development of social programming as an “empathy override,” a decision to outsource perceptions about who “deserves” care (Eubanks, 2018). This is a way of avoiding harder and more complex political conversations, and it relies on a scarcity model of resourcing social programs and care. Those conversations will be shaped by individual values, but we have to have them, and not hide behind the assumption that these processes are somehow neutral.
What algorithms, for example, decide who is a good bet for a mortgage or business loan, and what assumptions underlie those parameters? We see algorithms used to redraw community boundaries to further disenfranchise the poor and the marginalized. There’s a term for this: digital redlining (Gilliard, 2017). Indeed, just as old-fashioned analog redlining worked in the service of segregation and reduced class mobility, digital redlining has a direct impact on socioeconomic mobility. Algorithmic processes are increasingly used by credit bureaus to analyze your social media connections, making judgements about financial solvency based in part on a subject’s friends and relations (Waddell, 2016). Critically, a person’s network is not a protected class, so while it may be illegal for an employer or lender to discriminate based on race, gender, or ability, it is not illegal to discriminate based on algorithmic assumptions drawn from a person’s network (Boyd et al., 2014). Consider how much more of your network is documented and searchable now than ever before; your connection to a person the lender sees as undesirable is no longer theoretical or circumstantial, but comes with a lengthy data trail. Even though the realities of the people within a network may well be framed and circumscribed by those protected factors, nothing protects marginalized users from having this data turned against them. Which is to say: isn’t this just a fancy way to get around traditionally racist and classist practices?
Contract cheating firms are very aware of the power of algorithms—it’s how they find their clients. In Thomas Lancaster’s work describing how social media is used by contract cheating firms, he is effectively describing an algorithmic process when he reflects on how “A single tweet by a student, even one expressing that they have an assignment due with no indication that they plan to cheat, can lead to them receiving 20 or more visible replies from contract cheating providers within an hour from when the tweet is made” (Lancaster, 2019b). These aren’t human beings scanning social media: these are bots. They phish for students in incredibly predatory ways, using algorithmic processing of keywords to track students on social media and pounce when they are most vulnerable. If your institution has a hashtag it uses to collect student posts on social media, you can see this for yourself by following it for a little while, especially around midterms and finals. You’ll quickly find these companies using institutional hashtags to reach students, often cloaking their services in terms of “editing” or “tutoring” or “help.” It’s easy, especially if you’re not versed in institutional branding—or you’re just panicked, looking for any lifeline, and wanting it to be real—to see some of these posts and wonder if they’re legitimately connected to the institution itself. The companies also use these hashtags to track students as potential customers. They particularly like combinations of hashtags that pair a specific institution with words expressing the affective experience of student stress: #essaydue #finalsstress #essayhelp. So a student who is looking to commiserate with classmates on Instagram, and who uses the hashtag for her institution alongside, maybe, #freakingout #paperdue #needhelp, sends a bat signal not only to her classmates but also to predatory contract cheating firms, who sweep into her direct messages at the last moment and offer “assistance.”
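To underline how little sophistication this targeting requires, here is a minimal, hypothetical sketch of the kind of hashtag and keyword matching described above. The institutional tag, the stress tags, and the canned reply are all invented for illustration; this is not any firm’s actual code.

```python
# Hypothetical sketch of the hashtag/keyword targeting described above.
# The institutional tag, stress tags, and reply text are invented.

INSTITUTION_TAGS = {"#exampleu"}
STRESS_TAGS = {"#essaydue", "#finalsstress", "#essayhelp",
               "#freakingout", "#paperdue", "#needhelp"}

def is_target(post_text: str) -> bool:
    """Flag a post that pairs an institutional hashtag with a stress hashtag."""
    words = set(post_text.lower().split())
    return bool(words & INSTITUTION_TAGS) and bool(words & STRESS_TAGS)

def draft_reply(post_text: str) -> str | None:
    """Return a canned 'assistance' reply for flagged posts, otherwise None."""
    if is_target(post_text):
        return "Stressed about that essay? We can help tonight! DM us."
    return None

print(draft_reply("three hours of sleep left #exampleu #paperdue #freakingout"))
# -> Stressed about that essay? We can help tonight! DM us.
```

Pointed at a public hashtag feed, a loop over matches like this is all it would take to surface “customers” within minutes of a stressed post.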
While it is never okay to purchase an essay, it’s easy to imagine a situation where desperation combined with opportunity results in an individual making a choice they shouldn’t. Given the spiralling rise in contract cheating, it doesn’t seem likely that students are suddenly less ethical than they used to be; research suggests that cheating is a highly contextual act, and even students who seem predisposed to contract cheating typically do not engage in it for every assessment (Ramberg & Modin, 2019; Rundle et al., 2019). Students are targeted by predatory companies when they are at their most panicked and most stressed, and this “help” is available precisely at that lowest point—say, 2 am the morning before a paper is due—when legitimate resources like learning centres, campus tutors, and office hours aren’t. Contract cheating is wrong. Preying on vulnerable students, and profiting off their misery, is more wrong.