Trolls, bans and reverts: simulating Wikipedia

Abstract

The surprisingly high reliability of Wikipedia has often been seen as a beneficial effect of the aggregation of diverse contributors, or as an instance of the wisdom-of-crowds phenomenon; additional factors such as elite contributors, Wikipedia’s policy or its administration have also been mentioned. We adjudicate between such explanations by modelling and simulating the evolution of a Wikipedia entry. The main threat to Wikipedia’s reliability, namely the presence of epistemically disruptive agents such as disinformers and trolls, turns out to be offset only by a combination of factors: Wikipedia’s administration and the possibility of instantly reverting entries, each of which is insufficient when considered in isolation. Our results suggest that the reliability of Wikipedia calls for a pluralist explanation, involving factors of different kinds.
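
The paper’s simulation code is not reproduced on this page, but the dynamics described in the abstract can be illustrated with a minimal agent-based sketch. The Python toy model below is an illustration only, not the authors’ model: the troll share, the revert detection rate, the reliability distribution and the ban threshold are all assumed values.

    import random

    def simulate_entry(n_agents=100, troll_share=0.05, n_steps=2000,
                       revert=True, admin=True, ban_after=3, seed=0):
        """Toy dynamics for a single entry. Honest agents add claims that
        are true with probability equal to their reliability; trolls only
        add false claims. With revert on, a troll edit is undone with an
        assumed detection probability; with admin on, detected trolls are
        banned after ban_after strikes. Returns the final proportion of
        true claims in the entry."""
        rng = random.Random(seed)
        n_trolls = int(troll_share * n_agents)
        agents = [{"troll": i < n_trolls,
                   "reliability": rng.uniform(0.5, 1.0),  # assumed range
                   "strikes": 0, "banned": False}
                  for i in range(n_agents)]
        true_claims = false_claims = 0
        for _ in range(n_steps):
            agent = rng.choice([a for a in agents if not a["banned"]])
            if agent["troll"]:
                false_claims += 1
                if revert and rng.random() < 0.8:  # assumed detection rate
                    false_claims -= 1              # instant revert
                    agent["strikes"] += 1
                    if admin and agent["strikes"] >= ban_after:
                        agent["banned"] = True     # administration bans
            elif rng.random() < agent["reliability"]:
                true_claims += 1
            else:
                false_claims += 1
        total = true_claims + false_claims
        return true_claims / total if total else 1.0

    print(simulate_entry(revert=False, admin=False))  # trolls unchecked
    print(simulate_entry())                           # reverts plus bans

In this sketch, disabling both mechanisms lets false claims accumulate at the trolls’ editing rate, while enabling both lets reverts remove most troll edits and lets the resulting strikes trigger bans, which loosely mirrors the combination of factors discussed in the paper.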

Notes

  1. “We are also concerned with how much knowledge can be acquired from an information source, how fast that knowledge can be acquired, and how many people can acquire it” (Fallis 2011: p. 305).

  2. Some of these works have since been criticised for their lack of robustness or for their artificial assumptions. Here, however, we are only concerned with the assumptions about epistemically beneficial agents that are shared by these works and by their refinements.

  3. In the non-formal literature, an earlier similar thesis is that of Hull (1988), who argues that the success of science cannot be properly explained if one neglects the scientists’ desire for recognition.

  4. Moreover, there exists a watchlist tool, which notifies users whenever a page of interest is modified.

  5. The recommendations may be found at: https://en.wikipedia.org/wiki/Wikipedia:Verifiability.

  6. See https://en.wikipedia.org/wiki/Wikipedia:Contributing_to_Wikipedia.

  7. In particular, one salient characteristic of troll actions is that they are highly repetitive; see Shachaf and Hara (2010).

  8. Note that in our model, this process involves noise. More precisely, the model compares the reliability of the checking user, modified by some noise, to the reliability of the author of the checked information, also modified by some noise. When the former exceeds the latter, the checking user learns whether the piece of information is true or false.
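
     For concreteness, here is a minimal sketch of such a check, assuming Gaussian noise (the note says only “some noise”, so the distribution and its scale below are assumptions):

         import random

         def check_reveals_truth(checker_rel, author_rel, noise_sd=0.1,
                                 rng=random):
             """Noisy comparison from note 8: both reliabilities are
             perturbed independently, and the check succeeds (the checker
             learns whether the claim is true or false) only if the
             checker's noisy reliability exceeds the author's."""
             return (checker_rel + rng.gauss(0.0, noise_sd)
                     > author_rel + rng.gauss(0.0, noise_sd))

     Under this reading, a checker slightly less reliable than an author will still occasionally see through a false claim, and a more reliable checker will sometimes fail to.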

  9. For instance, with a Gaussian distribution with a mean of 0.1, the entry’s reliability still converges to 1, although more slowly, for two reasons. First, the rate of increase of true information is about a third of what it was in the 0.5-mean condition. Second, there is a constant, noisy but non-negligible amount of false information.

  10. Moreover, the threshold proportion of trolls above which reversion becomes ineffective is typically around 10%, though this value varies with parameters such as the size of the modifications performed by trolls.

  11. According to Viegas et al. (2004), about half of mass deletions are reverted within three minutes.

  12. See https://en.wikipedia.org/wiki/Wikipedia:Edit_warring.

  13. Note that the results would not necessarily be robust to a decrease in troll activity, because this would make trolls both less disruptive and harder to detect (which hampers the efficiency of the administration/revert solution).

  14. See https://en.wikipedia.org/wiki/Wikipedia:List_of_Wikipedians_by_number_of_edits.

  15. We thank an anonymous referee for drawing our attention to the distribution of user activity.

  16. We thank an anonymous referee for urging us to mention such motivations.

  17. Currently more than 5 million; see https://en.wikipedia.org/wiki/Wikipedia:Size_of_Wikipedia.

  18. One particular difficulty in modelling multiple entries is finding a common time scale. In our graphs, changes are measured as a function of the number of entry versions. However, versions may succeed one another at different paces for different entries, which complicates their combination.

  19. For a fuller description of Wikipedia’s administrators, see https://en.wikipedia.org/wiki/Wikipedia:Administrators.

References

  • Anderson, C. (2006). The long tail. New York: Hyperion.

  • Fallis, D. (2011). Wikipistemology. In A. I. Goldman & D. Whitcomb (Eds.), Social epistemology: Essential readings. Oxford: Oxford University Press.

  • Frankfurt, H. (2005). On bullshit. Princeton: Princeton University Press.

  • Giles, J. (2005). Internet encyclopaedias go head to head. Nature, 438, 900–901.

  • Goldman, A. (1987). Foundations of social epistemics. Synthese, 73, 109–144.

  • Hull, D. (1988). Science as a process. Chicago: University of Chicago Press.

  • Kitcher, P. (1990). The division of cognitive labor. The Journal of Philosophy, 87(1), 5–22.

  • Magnus, P. D. (2009). On trusting Wikipedia. Episteme, 6(1), 74–90.

  • Sanger, L. M. (2009). The fate of expertise after Wikipedia. Episteme, 6(1), 52–73.

  • Shachaf, P., & Hara, N. (2010). Beyond vandalism: Wikipedia trolls. Journal of Information Science, 36, 357–370.

  • Strevens, M. (2003). The role of the priority rule in science. The Journal of Philosophy, 100(2), 50–79.

  • Surowiecki, J. (2004). The wisdom of crowds. Garden City: Anchor Books.

  • Tsvetkova, M., García-Gavilanes, R., Floridi, L., & Yasseri, T. (2017). Even good bots fight: The case of Wikipedia. PLoS ONE, 12(2), e0171774. https://doi.org/10.1371/journal.pone.0171774.

  • Viegas, F., Wattenberg, M., & Dave, K. (2004). Studying cooperation and conflict between authors with history flow visualizations. Proceedings of the Computer-Human Interaction, 6(1), 575–582.

  • Weisberg, M., & Muldoon, R. (2009). Epistemic landscapes and the division of cognitive labor. Philosophy of Science, 76(2), 225–252.

  • Zollman, K. J. S. (2007). The communication structure of epistemic communities. Philosophy of Science, 74, 574–587.

Acknowledgements

We thank Anouk Barberousse for her comments on an early version of this work.

Author information

Correspondence to Cédric Paternotte.

Additional information

Cédric Paternotte: The simulations were programmed and run by Valentin Lageard.

Cite this article

Lageard, V., Paternotte, C. Trolls, bans and reverts: simulating Wikipedia. Synthese 198, 451–470 (2021). https://doi.org/10.1007/s11229-018-02029-0
