Trolls, bans and reverts: simulating Wikipedia


The surprisingly high reliability of Wikipedia has often been seen as a beneficial effect of the aggregation of diverse contributors, or as an instance of the wisdom of crowds phenomenon; additional factors such as elite contributors, Wikipedia's policy or its administration have also been mentioned. We adjudicate between such explanations by modelling and simulating the evolution of a Wikipedia entry. The main threat to Wikipedia's reliability, namely the presence of epistemically disruptive agents such as disinformers and trolls, turns out to be offset only by a combination of factors: Wikipedia's administration and the possibility of instantly reverting entries, both of which are insufficient when considered in isolation. Our results suggest that the reliability of Wikipedia should receive a pluralist explanation, involving factors of different kinds.


Figs. 1–9 (figure images not included in this version).


1.

    “We are also concerned with how much knowledge can be acquired from an information source, how fast that knowledge can be acquired, and how many people can acquire it” (Fallis 2011: p. 305).

2.

Some of these works have later been criticised for their lack of robustness or for artificial assumptions. However, here we are merely concerned with the assumptions about epistemically beneficial agents that are shared by these works as well as by their refinements.

3.

    In the non-formal literature, an earlier similar thesis is that of Hull (1988), who argues that the success of science cannot be properly explained if one neglects the scientists’ desire for recognition.

4.

Moreover, there exists a watchlist tool, which notifies users whenever a page of interest is modified.

5.

The recommendations may be found at:

6.


7.

One salient characteristic of troll actions is that they are highly repetitive; see Shachaf and Hara (2010).

8.

Note that in our model, this process involves noise. More precisely, the model compares the reliability of the checking user, modified by some noise, to the reliability of the author of the checked information, also modified by some noise. When the former exceeds the latter, the user correctly determines whether a piece of information is true or false.
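As a rough illustration of this check (the function name, signature and the Gaussian form of the noise are our own assumptions for the sketch, not the paper's exact parameters), the comparison could be written as:

```python
import random

def detects_error(checker_reliability, author_reliability, noise_sd=0.1):
    """Return True if the checking user correctly assesses the information.

    Both reliabilities are perturbed by independent Gaussian noise
    (noise_sd is an assumed illustrative parameter); the check succeeds
    when the checker's noisy reliability exceeds the author's.
    """
    noisy_checker = checker_reliability + random.gauss(0, noise_sd)
    noisy_author = author_reliability + random.gauss(0, noise_sd)
    return noisy_checker > noisy_author
```

With noise_sd set to 0 the check reduces to a deterministic comparison of reliabilities; increasing it makes even low-reliability authors occasionally escape detection.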

9.

For instance, with a Gaussian distribution with mean 0.1, the entry's reliability still converges to 1, although more slowly, because of two factors. First, the rate of increase of true information is about a third of what it was in the 0.5-mean condition. Second, there is a constant, noisy but non-negligible amount of false information.
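A minimal sketch of how user reliabilities could be drawn in such a condition (the clipping to [0, 1] and the standard deviation are our own assumptions, not parameters stated here):

```python
import random

def draw_reliability(mean=0.1, sd=0.15):
    """Draw a user reliability from a Gaussian clipped to [0, 1].

    The default mean matches the 0.1-average condition discussed above;
    the standard deviation is an assumed illustrative value.
    """
    return min(1.0, max(0.0, random.gauss(mean, sd)))
```

Under such a distribution most users are barely reliable, which explains the slower but still positive accumulation of true information.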

10.

Moreover, the threshold proportion of trolls above which reversion becomes inefficient is typically around 10%; this value varies with parameters such as the size of the modifications performed by trolls.

11.

    According to Viegas et al. (2004), about half of mass deletions are reverted within three minutes.

12.


13.

    Note that the results would not necessarily be robust to a decrease in troll activity, because this would both make trolls less disruptive and harder to detect (which hampers the efficiency of the administration/revert solution).

14.


15.

    We thank an anonymous referee for drawing our attention to the distribution of user activity.

16.

We thank an anonymous referee for urging us to mention such motivations.

17.

Currently more than 5 million; see

18.

One particular difficulty in modelling multiple entries is finding a common time scale. In our graphs, changes are measured as functions of the number of entry-version changes. However, such changes may occur at different paces for different entries, which complicates their combination.

19.

    For a fuller description of Wikipedia’s administrators, see


  1. Anderson, C. (2006). The long tail. New York: Hyperion.


  2. Fallis, D. (2011). Wikipistemology. In A. I. Goldman & D. Whitcomb (Eds.), Social epistemology: Essential readings. Oxford: Oxford University Press.


  3. Frankfurt, H. (2005). On bullshit. Princeton: Princeton University Press.


  4. Giles, J. (2005). Internet encyclopaedias go head to head. Nature, 438, 900–901.


  5. Goldman, A. (1987). Foundations of social epistemics. Synthese, 73, 109–144.


  6. Hull, D. (1988). Science as a process. Chicago: University of Chicago Press.


  7. Kitcher, P. (1990). The division of cognitive labor. The Journal of Philosophy, 87(1), 5–22.


  8. Magnus, P. D. (2009). On trusting Wikipedia. Episteme, 6(1), 74–90.


  9. Sanger, L. M. (2009). The fate of expertise after Wikipedia. Episteme, 6(1), 52–73.


  10. Shachaf, P., & Hara, N. (2010). Beyond vandalism: Wikipedia trolls. Journal of Information Science, 36, 357–370.


  11. Strevens, M. (2003). The role of the priority rule in science. The Journal of Philosophy, 100(2), 50–79.


  12. Surowiecki, J. (2004). The wisdom of crowds. Garden City: Anchor Books.


  13. Tsvetkova, M., García-Gavilanes, R., Floridi, L., & Yasseri, T. (2017). Even good bots fight: The case of Wikipedia. PLoS ONE, 12(2), e0171774.


  14. Viegas, F., Wattenberg, M., & Dave, K. (2004). Studying cooperation and conflict between authors with history flow visualizations. Proceedings of the Computer-Human Interaction, 6(1), 575–582.


  15. Weisberg, M., & Muldoon, R. (2009). Epistemic landscapes and the division of cognitive labor. Philosophy of Science, 76(2), 225–252.


  16. Zollman, K. J. S. (2007). The communication structure of epistemic communities. Philosophy of Science, 74, 574–587.




We thank Anouk Barberousse for her comments on an early version of this work.

Author information



Corresponding author

Correspondence to Cédric Paternotte.

Additional information

The simulations were programmed and run by Valentin Lageard.

Cite this article

Lageard, V., Paternotte, C. Trolls, bans and reverts: simulating Wikipedia. Synthese (2018).



  • Wikipedia
  • Social epistemology
  • Computer simulation
  • Collective knowledge
  • Wisdom of crowds