This paper explores Wikipedia bots and problematic information in order to consider the implications for cultivating students’ critical media literacy. While we recognize the key role of Wikipedia bots in addressing and reducing problematic information (misinformation and disinformation) on the encyclopedia, it is ultimately reductive to construe bots as having merely benign impacts. To understand bots and other algorithms as more than mere tools, we turn towards a postdigital theorization of them as ‘agents’ that co-produce knowledge in conjunction with human editors and actors. The paper presents case studies of three specific Wikipedia bots, ClueBot NG, AAlertBot, and COIBot, each of which engages in some type of information validation in the encyclopedia. The activities of these bots, illustrated in the case studies, support our argument that information validation processes in Wikipedia are complicated by their distribution across multiple human–computer relations and agencies. Although these bots are programmed to combat problematic information, their efficacy is challenged by social, cultural, and technical issues related to misogyny, systemic bias, and conflict of interest. Studying the function of Wikipedia bots makes space for extending educational models of critical media literacy. In the postdigital era of problematic information, students should be alert to how the human and the nonhuman, the digital and the nondigital, interfere and exert agency in Wikipedia’s complex and highly volatile processes of information validation.
Cite this article
Jiang, J., Vetter, M. The Good, the Bot, and the Ugly: Problematic Information and Critical Media Literacy in the Postdigital Era. Postdigit Sci Educ 2, 78–94 (2020). https://doi.org/10.1007/s42438-019-00069-4
Keywords: Critical media literacy · Problematic information