In this chapter, I provide a post-structural textual analysis of the official Overwatch player forums (McKee 2003) to highlight the issues players experience with the specific moderation systems deployed by Overwatch’s developers. The purpose of this chapter is to examine primarily the player input on Overwatch’s gameplay issues created (or solved) by moderation strategies. The chapter introduces each moderation tactic discovered through my research, briefly explains what it is, and then delves into the player responses to it over time. I then briefly summarize the player discourse and interventions on these systems. My intention for this analysis is to use the discourse surrounding the moderation systems to highlight the rift between casually and competitively identifying players, and to show how this rift causes conflicts in what players might expect and want from their Overwatch gameplay experience.

First, I address the terms “moderation strategies” and “moderation systems” to clarify what they mean and how they are applied in this research. Concerning “moderation strategies”, I draw from De Certeau’s definition, according to which strategies are the tools of those in power that shape and enforce particular spaces (in this case, Blizzard Entertainment and the Overwatch development team) (De Certeau 2011). Strategies allow for the exertion of control over spaces. “Moderation systems”, on the other hand, are the units of a strategy—the different social systems in place to manage and govern players (the report system, Avoid as Teammate, and Endorsements, each discussed in more detail below). Essentially, “moderation strategies” is an umbrella term for the amalgamation of these different moderation systems. I will use the terms as precisely as possible throughout; for the most part, the difference between them is one of distinction and scale.

Earlier Research

Considering the large amount of scholarship on participatory governance and moderation practices in online multiplayer games (TL Taylor 2006; Gray 2014; Kou and Nardi 2014; Busch et al. 2015), what sets Blizzard Entertainment and Overwatch apart? From early on, a significant component of Blizzard Entertainment’s peripheral marketing around Overwatch was its “Developer Updates” on the Play Overwatch YouTube channel. Alongside promotion for upcoming competitive seasons and hero releases are detailed videos explaining the social systems being implemented in the game, and the updates made to them along the way. The emphasis on sustainable social systems is part of Blizzard Entertainment’s corporate rhetoric, specifically for Overwatch, as these developer updates exist as a way to “converse” with players—to show that their feedback on the forums is fruitful and taken into consideration.

In addition to game-based moderation systems, the moderation of online platforms is relevant, too. Current literature focuses on “internet governance” regarding the interplay of media policy, social media, and online community management—both in the technical infrastructure and in the regulation of users (Freedman 2010; DeNardis and Hackl 2015). Duguay and colleagues specifically analyze Tinder, Instagram, and Vine to determine queer women’s experience with how the platforms moderate content and users, often to the detriment of underrepresented groups—and highlight “the disconnect between platforms’ formal governance rules … and the impacts on user experience of platform architectures and cultures” (Duguay et al. 2018, 2). These platforms, in other words, make use of formal governance while ignoring the cultures of use on their platforms. Tinder, for example, has a formal report mechanic in its app; however, many do not use it because reporting does not seem to have any concrete effect. A similar dissonance is prevalent in online game moderation as well.

Tarleton Gillespie (2018) identifies platform developers as “custodians” who facilitate, and are responsible for, the interactions that take place on their platforms. Another point that makes Blizzard Entertainment and Overwatch central to moderation research is their custodian role in the play space. Especially as Overwatch is a live-service game,Footnote 1 it emulates legalities and policies similar to those found when joining Twitter or Facebook, in the form of a Code of Conduct (CoC) and End User License Agreement (EULA) that players must sign to access the game client. The tribulations of trying to manage online spaces are similar between social media and online gaming—issues of misogyny, racism, and homophobia, as well as hacking and misuse of software, to name but a few problems, run rampant across online platforms. What I identify through this analysis is that players will subvert the systems presented to them to make those systems operate in the way they desire, rather than for the developers’ intended purposes.

Previous literature has investigated how players become professionals in esports (Taylor 2006; Witkowski and Manning 2017). Juul, in turn, shows that players focused on “mastery” express great discontent toward the “no fail” mode in Guitar Hero, not wanting their own ability to be “diminished” (Juul 2012, 143). Relatedly, Consalvo and Paul unpack the legitimacy of “casual” games and how their audiences are subsequently deemed not “real” players (2019). This value judgment is perpetuated by “real”, committed, hardcore players to distinguish themselves from casual and leisurely players. TL Taylor identified “power gamers” as those with the utmost commitment to their play in EverQuest, with distinct knowledge of how to optimize their characters mechanically, as opposed to players who spend a moderate amount of time in a game (2003).

In Overwatch, especially in the quickplay mode, the player population is an unpredictable mix of players who might identify as either casual or competitive (see Vahlo & Karhulahti, this volume). For this chapter, I define casual players as ones who play Overwatch for leisure, and while obviously wanting to win, they are less invested in optimizing their play performance. Competitive players, in turn, are ones attempting to emulate professional players, usually by engaging in meta playFootnote 2 and trying to climb the competitive ranks. Both casual and competitive attitudes are present in the Overwatch community, and they are taken into consideration in the game’s design. Overwatch makes use of components identified by Juul as part of casual game design; for instance, “juiciness”, where “excessive positive feedback” is central to the game’s aesthetic (Juul 2012, 45). Additionally, there is the “interruptibility” component in quickplay (Juul 2012, 30). Quickplay matches also run significantly shorter than competitive matches, allowing for less commitment. Although individual matches cannot be paused, they are segmented into short rounds and permit short play sessions. At the same time, Overwatch’s design also champions instrumental play through optimizing one’s playstyle (Taylor 2003). More powerful meta heroes, for instance, can be chosen over less powerful but potentially more “fun” off-meta heroes, and in both cases, learning to play the heroes properly often takes a lot of time and effort.

Multiple scholars have embarked on research with which this chapter is in conversation. Kishonna Gray’s foundational work on the racist and misogynistic culture of Xbox Live pushed discourse around the impacts of allowing such behaviors to proliferate and oppress minorities in online leisure spaces (Gray 2012). Lajeunesse (2018) engaged with the media dispositive that surrounds the DOTA 2 community, reinforcing toxicity, through autoethnography, participant observation, and analysis of journalism, community forums, and official Valve correspondence, in order to build a picture of what allowed toxic behaviors to metastasize. Blizzard Entertainment, and more specifically World of Warcraft (WoW), has been extensively researched, as Blizzard endures as one of the most financially successful game studios to date (Jordan 2018). Through investigating Blizzard forums, Crenshaw and Nardi sought to analyze WoW player reactions to patches that altered how social interfaces worked, resulting in some players remaining on unpatched (and illegal) “Vanilla”Footnote 3 versions of the game (Crenshaw and Nardi 2016).

In the first year of Overwatch, the online community was reported to be less toxic, and more supportive compared to other online PvP games (e.g., DOTA 2 and League of Legends), thus potentially heralding a more positive player base and building collegiality (Purslow 2016; u/Fyre2387 2016; Webster 2016; Stuart 2017). However, with the introduction of the competitive mode, tensions and frictions began to arise (Grayson 2016; D’Anastasio 2017; u/LordAurora 2017). This chapter investigates those tensions and frictions.

Data and Methods

Methodologically, I apply close-reading forum analysis. The Overwatch community has a burgeoning presence on various forum sites, such as Reddit, where r/overwatch and r/competitiveoverwatch are active subreddits. The data in this chapter was obtained from the official Blizzard Overwatch forums. I chose the official forums because they act as a pseudo-direct line to the developers, who regularly cite the fact that they read the forums for feedback, especially on managing the community’s behavior (PlayOverwatch 2018). Sometimes the developers respond directly to player threads, either to explain an issue or to provide context on a topic (Kaplan 2018). These forums have a high volume of player opinions around the implementation and execution of the different moderation strategies and systems, making them the site for my data collection.

The official Overwatch forums are split across multiple topics (general, competitive, story, technical support), with my data coming primarily from the general and competitive topics. I read and collected some 200 separate forum threads (as screenshots) by searching moderation-relevant terms across posts made between June 2016Footnote 4 and March 2019 (Blamey 2019). These terms included, but were not limited to, “report”, “chat”, “banned”, and “communication”, as well as the names of the moderation systems: “avoid this player”, “avoid as teammate”, and “endorsements”. I sorted the search results by the “most relevant” feature in order to avoid off-topic forum threads. The length of forum threads varied from a single post to discussions extending nearly six months.

I discuss three official moderation systems in this chapter: the Report, Avoid as Teammate, and Endorsements systems. These were the “social systems” promoted by Overwatch developers at the time of data collection. Below, I briefly define each system and analyze a selected example thread as a case study. This data was collected in accordance with the Canadian and Concordia University ethical research guidelines in 2018–2019.

Moderation/Report System

The in-game report system of Overwatch has evolved since the game’s initial launch in May 2016. In the early months, a player would go into their “social menu”, find a list of “recent players” (up to 63), and click on a player to report them for: “inappropriate BattletagFootnote 5”, “harassment”, “spam”, or “cheating”. Around 2017 (when the report system was added to consoles), three additional categories were added: “poor teamwork”, “griefing”, and “inactivity”, while “harassment” was removed (JayWaddy 2017). Descriptors were added to clarify what was and was not considered part of each category, likely to avoid false reporting. In May 2018, Blizzard Entertainment updated these categories again, merging “poor teamwork” and “griefing” into “gameplay sabotage”. One of Overwatch’s principal designers, Scott Mercer, explained that the new category made it easier to know why a player was reporting another player (Mercer 2018a, b).

A significant number of the forum threads I found discussed how the forum itself was moderated,Footnote 6 with some players disputing why they were banned.Footnote 7 As a result, many forum threads may have been deleted for containing inappropriate content before I began this research, meaning the remaining threads have been somewhat “curated”. With the purpose of this chapter being to highlight player interventions and discourses surrounding the moderation strategies, it is useful to analyze how players discuss moderation and reporting in a more general sense before going into the specific moderation systems. I will refer to the original poster of each forum thread as “OP”.

March 17, 2017, “Moderation of in Game Voice Comms”

This threadFootnote 8 discusses how best to moderate voice chat in Overwatch, with just two respondents offering their opinions—in direct opposition to one another. In the post, OP acknowledges that dealing with voice chat is a complicated issue, not to be addressed simply by giving players “banhammers or instant mute nuke buttons”, but via an authoritative body in the form of the support staff, with the assistance of players tagging toxic individuals for them (RATSTAB 2017).

OP makes specific reference to “Xbox Live circa 2013” to describe how players behave in voice chat. Xbox Live is commonly known as a toxic communication space, with an abundance of racist and sexist comments directed at players who do not fit the hegemonic ideal of the white, male gamer (Gray 2014). OP’s post highlights how toxic players in voice chat have carried the same attitudes over from other online game spaces and are behaving inappropriately in the chat function provided to converse with teammates, creating an unproductive and negative communication space. Forum posts on the topic of verbal abuse, or “comms abuse” (Blamey 2019), carry an expectation that with time, player behaviors should have developed beyond the mindset that verbal abuse is an acceptable way to talk to other players; unfortunately, this is not the case.

The second respondent calls out OP for muting people they do not want to hear, deeming them equally as problematic as those being toxic: “Mute that person you cannot stand? What exactly do we find ‘problematic’ now?” This highlights the subjectivity of what players deem problematic, and therefore mutable, in voice chat. OP does not respond to the replies on their thread, and so the conversation ends.

The “just mute” approach to problematic players is not a new phenomenon. In the late 1990s, Julian Dibbell wrote on governance in MUDs (Multi-User Dungeons), specifically LambdaMOO, where a player used a sub-program to force another player to perform virtual sexual acts against their will. This was met with much uproar and calls for the offending player to be removed entirely from the game, but when it resulted in a wider questioning of how LambdaMOO was to be governed in future instances, many players argued that encountering mean players was inevitable and that the “@gag”Footnote 9 command was a simple and effective method that did not censor players (Dibbell 1998). Dibbell counters that “gagging” players in LambdaMOO instead of actioning them only prevents the intended victim from seeing what is being said; there are still witnesses who can see the violation occurring (Dibbell 1998, 7). These witnesses could easily be just as impacted by the attack as the victim.

Abuse in voice communication and in-game chat was a frequent topic in the forums, with players voicing the futility of moderating toxic players. Temporary bans, used as punishment, removed problem players only momentarily, so why bother in the first place? (Goedmaker 2016; YJG 2017). As with Xbox Live’s report systems, Overwatch players simply wait out their ban and then continue their prior behavior. A more holistic issue with abusive voice chat in online games is its lack of protection for underrepresented groups, to whom particular slurs can be more damaging than to others.

The suggestion in some forum threads to turn off voice chat entirely carries more damaging consequences for those not offending than for those who are. Playing online games without voice comms, especially competitively, can harm a player’s chances, as they are unable to communicate with their teammates—and women make up a large majority of those who mute themselves, while already being underrepresented within esport communities. Nakamura specifies that voice chat has allowed for a “new kind of mediated race, sex, and gender discrimination” and that users had begun to create blogs to expose players participating in these discriminatory practices (2012, 2). Gray explains that voice chat is a form of “synchronous communication”, providing a space for real-time anonymous toxic chat (Gray 2012), meaning that underrepresented players cannot pre-emptively mute offensive players and are consistently at risk of verbal abuse. Additionally, these problematic players are not situated in either camp of casual or competitive players; rather, they are prevalent across the game.

Although the community has the tools to mute, block, and report players, these limited, and sometimes exploited, functions leave little room for impactful moderation on their end, and when the moderation ball is in Blizzard Entertainment’s court, it takes a high volume of reports for an account to be actioned. As players explain in other threads on this topic (Blamey 2019), muting offensive players can result in being reported for lack of communication, so using the tool provided leads to a player being wrongly reported and actioned. Blizzard Entertainment thus holds players who do not cooperate with how the game needs to be played (with communication) as accountable as players who are abusive to their teammates.

While Blizzard Entertainment has publicly punished its pro players and streamers for behaving poorly in public and Blizzard-represented spaces (such as tournaments and Twitch), the consensus across the collected forum threads is that professional players can get away with poor behavior in private, and that these instances of punishment are often due to the extremeness of the offending professional players’ actions. Consalvo, in her study of cheating in videogames, writes that players of multiplayer games who saw cheating go unpunished by the game-owning companies lost trust in those companies and played less (Consalvo 2009, 144). Moreover, the initially missing console report system placed PC players (who compete more) at a higher priority than console players. Although Blizzard Entertainment has so far claimed that they do not condone disruptive behavior, it appears to the player community that they have been selective in whom and when they punish (Alexander 2018).

Avoid as Teammate

The “Avoid as Teammate” (AaT) feature evolved from “Avoid this Player” (AtP), which allowed players to avoid playing against chosen opponents. As AtP was soon abused to skew matchmaking in order to dodge difficult opponents, AaT eventually replaced it, only allowing players to remove up to three players from appearing on their team, for seven days at a time.

March 25, 2018, “Avoid as Teammate Griefing Unpopular Heroes”

At the time of this thread, the development team had just announced AaT in the YouTube video “Developer Update—Avoid as Teammate” featuring Jeff Kaplan (PlayOverwatch 2018). The official justification for the removal of AtP was that a mass-avoided top-ranked Widowmaker player had become unable to queue into any competitive matches. Blizzard Entertainment removed the tool as a direct response to this in June 2016; in addition, the unlimited number of “avoid” slots, which players used liberally, had meant that the game’s matchmaker struggled to generate matches at all (Prell 2016). This threadFootnote 10 was a response to the potential impacts AaT might have, generating a discussion of 40 comments. Interestingly, OP immediately flagged the feature as a potential “griefing” tool, arguing that AaT’s functionality does not protect players choosing to play “off-meta”. In their post, OP states that:

The new “avoid teammate” feature you are proposing will in most part actually be abused by toxic people on players who may use unpopular heros [sic], more than its intended use. This will affect those honest players more so than anything else. (Zeron 2018)

OP predicts that AaT will be used to avoid off-meta players. With the avoid feature gone and no other way to exercise their dislike for off-meta players, players had ostensibly already been throwing matchesFootnote 11 because a teammate had selected an “unpopular” hero. OP’s stance is that “meta” players are likely to use AaT to punish players who are not playing the “meta” heroes. OP then clarifies the issue: an “honest” (off-meta) player would avoid two toxic (strictly meta) players, whereas ten toxic meta players would avoid the one honest player, thus causing the honest player longer queue times due to their play style. Essentially, OP’s point is not that playing meta makes a player toxic, but that forcing others to play meta does. According to the other AaT forum threads, optimizing one’s team composition according to how professionals play is more important to competitive players than to others. Casually identifying players are generally less concerned about playing meta heroes, which places them at risk of being avoided when competitive players mix with casual players in the quickplay mode.

The discussions in this thread around the AaT tactic open up a larger debate on what it is right to “avoid”. In developer update videos, Jeff Kaplan states that players can avoid others for any reason they see fit—the feature is not solely for toxic players (PlayOverwatch 2018). While the report system is still in place, it seems that the developers are enabling off-meta players to be punished for their play styles. “Avoided” players do receive a warning when avoided by “a considerable number of players” (WyomingMyst 2018). It is unclear whether the developers are taking a stance on play styles, but it can certainly be inferred that off-meta players face a very real possibility of undue punishment. The sentiment in many of the forum threads was that off-meta players will use AaT to avoid disruptive players, while meta players can use AaT to avoid both off-meta players and disruptive players (Blamey 2019).

Notably, the labels that characterize players’ styles, such as “off-meta”, “meta”, and “one tricks”,Footnote 12 have been created by the player community, not the developers. Christopher Paul writes on “theorycrafting”, a practice in World of Warcraft (WoW) by which players analyzed the game world’s underlying mathematics to find the optimal way to play. This shifted play styles in WoW, and theorycrafting became synonymous within the community with “good” WoW play (Paul 2011). Theorycrafting, like meta play, is used within a game’s sub-communities to self-define what is “good”, but there is also resistance to this “optimization of play” as restrictive (Paul 2011). While the game mechanics may afford space for all these different play styles, the clash over the “right” way to play the game is left to the players themselves.

The forums voice that the implementation of AaT has not reconciled formal systems with nuanced player practices (TL Taylor 2006)—and no significant adjustments to AaT have been made to reflect this.Footnote 13 The discrepancy over off/meta play arises from competitive players in the competitive mode, seeping over into quickplay and casual player spaces and generating a conflict in player expectations. While AaT may have created a preventative method for reducing toxicity in matches by keeping apart players who do not play well together, it at the same time fostered a climate where players could dictate how people should play the game and who, canonically, was avoidable.

Endorsements (Released June 2018)

Endorsements were introduced as a tool for positive reinforcement. Instead of punishing bad behavior, this system rewards good, cohesive play in a team. At the end of a match, players can endorse up to three teammates or enemies (but not players on their “friends” list) via three different types: “shot caller”, “good teammate”, and “sportsmanship”. Endorsement levels range from 1 to 5 (lowest to highest), and a higher endorsement level is often read as the sign of a positive player. As an incentive, climbing and maintaining endorsement levels also provides “periodic” loot boxesFootnote 14 (Overwatch Wiki 2019).

July 1, 2018, “Flaws in the Endorsement System”

This particular threadFootnote 15 spanned several months between July 2018 and April 2019, gathering 71 comments. OP begins their post by praising the endorsement system and how it has improved their opinion of quickplay in comparison to competitive play, and then proceeds to pull apart the system’s issues. To sum up the lengthy post, OP highlights how the system is counterintuitive for increasing a player’s endorsement level. For example, a player can only endorse the same person once every 12 hours, so OP points out that, if you want to prevent your endorsement level decaying over time, there is no reason to stay in a group once all six teammates have endorsed you.Footnote 16

OP’s main complaint is about not being able to endorse friends. While understanding how easily endorsing friends could break the purpose of the system—players with enough friends could reach high endorsement levels effortlessly—OP suggests that a friend’s endorsement be worth a small fraction of a regular one. Since successful competitive play requires a team of six players, according to OP, not being able to endorse friends seems to punish playing the game in its optimal manner.

During the first set of forum responses, between July and August 2018, the endorsement system was relatively new, and while there were issues, players seem to agree that even the “fake nice” players looking to increase their endorsement level have made the game considerably less toxic. However, the consensus of the thread seems to be that the endorsement system is somewhat unfavorable to the most optimal way to play, and that it lacks credibility. Many state that the endorsements they have received do not make sense; for example, someone without a microphone receiving a “shot caller” endorsement. Others concur that, as a result, endorsements exist as a blanket “you did good”. On the other hand, some discussants found that the different types of endorsement have levels of rarity, with “shot caller” being the rarest and most sought after.

In the AaT section of this chapter, players were concerned that the tool could only be effective for the top 15% of Overwatch players (Sofrito 2018). This concern is echoed once again regarding the endorsement tool. One player highlights that endorsements accrue in proportion to the amount of time played, meaning that those dedicating more time to Overwatch can climb the endorsement ranks faster than those who play less (Deus 2018). As the most dedicated demographic consists mainly of streamers and semi-professional/professional players, the forum discussants feel that Overwatch’s content creators benefit from the system where others might not. A month later, a player describes that the “shiny has worn off” (Truen 2018) the endorsements, and that players now endorse solely to gain an extra 150XP, a point with which multiple players agree in their responses to OP. By endorsing without reason in order to gain the maximum amount of XP per match, these players are simply gaming the system to their advantage.

A new wave of responses in the thread came in October–November, some three months later. One player highlights that endorsements are more easily received by support players and voices concern over how difficult it is to maintain a high endorsement level. The decay rate of endorsements has frustrated a majority of players: even when they receive an endorsement from another player, they may soon drop an endorsement level regardless. Working to maintain a positive attitude with other players only to find that the endorsement level drops anyway is demotivational and counterproductive. As with the report system, players have no insight into the numbers operating within the endorsement system, so that it cannot be gamed; however, this also means that the system appears entirely nonsensical to players who drop an endorsement rank.
Another second-wave respondent points out that endorsements occur more frequently when a team has won; when a team loses, players acknowledge one another less. The system might thus favor players with higher win rates.

All of this suggests there are significant flaws in the system that are not being addressed by the developers. Players are keen to keep their endorsement level high, as it ties into how they are perceived by other players in matches (level 1 is widely understood to mark a toxic player), but they do not feel supported in maintaining that level when it seems to drop without warning.


A key conflict at the heart of Overwatch is the mixing of players with diverse—and even opposite—motivations. All the game’s moderation systems are closely linked to one another, forming a tight-knit moderation strategy that has produced solutions as well as new problems. Each moderation system, despite its good intentions, has issues and is often abused for various purposes.

These issues exist across most online gaming platforms, however, and especially in those with an esports scene. Moderating large-scale communities online is hardly a trivial task, and Blizzard Entertainment is known to respond to its community more than many other companies do. Perhaps what makes Overwatch, to some degree, a unique case is its early inclination to serve the less competitive player base, now contrasted with a strong focus on the esports scene and balancing for high-level ranked play.

The solutions suggested by players voice active participation in governance and a desire to improve the systems they are a part of (TL Taylor 2006; Kou and Nardi 2014; Duguay et al. 2018). Repeatedly, players disagree with the developers’ choices, pointing at where the system is not functioning as it should.

John Banks discusses the tensions between developers and community members surrounding the co-creation of games during his time consulting for the Australian game company Auran (Banks 2009). The development of Auran’s game Fury used a mixture of developers and community testers, but in the final months there were multiple disagreements about the game’s design between “the expertise and creative control” of Auran and “the collective intelligence” of the game’s community (Banks 2009, 80). Fury flopped, and the lack of response to the community’s concerns was highlighted as part of the reason for the failure. Although the revisions to Overwatch’s moderation systems have had an apparent positive impact on the game, as shown by 40% fewer reports of toxic behavior since 2018 (Grayson 2019), it is evident that the systems are hardly flawless. The persistent issues are at least partly a product of conflicting expectations in a community that consists of players who play Overwatch for different reasons.