Content moderation is the organized practice of screening user-generated content (UGC) posted to Internet sites, social media, and other online outlets, in order to determine the appropriateness of the content for a given site, locality, or jurisdiction. The process can result in UGC being removed by a moderator acting as an agent of the platform or site in question. Social media platforms increasingly rely on massive quantities of UGC to populate their sites and drive user engagement; with that growth has come a concomitant need for platforms and sites to enforce their rules and applicable laws, as the posting of inappropriate content is considered a major source of liability.
The style of moderation can vary from site to site, and from platform to platform, as rules around what UGC is allowed are often set at a site or platform level and reflect that platform’s brand and reputation,...
Keywords: Social Media · Online Community · Content Moderation · Social Media Platform · Moderation Practice