The surprisingly high reliability of Wikipedia has often been seen as a beneficial effect of the aggregation of diverse contributors, or as an instance of the wisdom of crowds phenomenon; additional factors such as elite contributors, Wikipedia’s policy or its administration have also been mentioned. We adjudicate between such explanations by modelling and simulating the evolution of a Wikipedia entry. The main threat to Wikipedia’s reliability, namely the presence of epistemically disruptive agents such as disinformers and trolls, turns out to be offset only by a combination of factors: Wikipedia’s administration and the possibility to instantly revert entries, both of which are insufficient when considered in isolation. Our results suggest that the reliability of Wikipedia should receive a pluralist explanation, involving factors of different kinds.
“We are also concerned with how much knowledge can be acquired from an information source, how fast that knowledge can be acquired, and how many people can acquire it” (Fallis 2011: p. 305).
Some of these works have been later criticised for their lack of robustness or artificial assumptions. However, here, we are merely concerned with the assumptions of epistemically beneficial agents that are shared by these works as well as their refinements.
In the non-formal literature, an earlier similar thesis is that of Hull (1988), who argues that the success of science cannot be properly explained if one neglects the scientists’ desire for recognition.
Moreover, there exists a watchlist tool, which allows users to be notified whenever a page of interest is modified.
The recommendations may be found on: https://en.wikipedia.org/wiki/Wikipedia:Verifiability.
In particular, one salient characteristic of troll actions is that they are particularly repetitive; see Shachaf and Hara (2010).
Note that in our model, this process involves noise. More precisely, the model compares the reliability of the checking user, modified by some noise, to the reliability of the author of the checked information, also modified by some noise. The checking user correctly identifies whether a piece of information is true or false only when the former exceeds the latter.
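The noisy checking step described above can be sketched as follows. This is a minimal illustration under our own assumptions, not the authors' actual code: the function name, the Gaussian noise with a fixed standard deviation, and the parameter values are all hypothetical.

```python
import random

def check_succeeds(checker_reliability, author_reliability, noise_sd=0.1):
    """Hypothetical sketch of the model's noisy check: the checking user
    correctly identifies a piece of information as true or false only when
    their noise-perturbed reliability exceeds the author's noise-perturbed
    reliability. The noise magnitude (noise_sd) is an assumed parameter."""
    perturbed_checker = checker_reliability + random.gauss(0.0, noise_sd)
    perturbed_author = author_reliability + random.gauss(0.0, noise_sd)
    return perturbed_checker > perturbed_author
```

With zero noise the comparison is deterministic (a 0.9-reliability checker always beats a 0.1-reliability author); with noise, a less reliable checker occasionally prevails, which is what makes the process imperfect.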
For instance, with a Gaussian distribution with an average of 0.1, the entry's reliability still converges to 1, although more slowly, for two reasons. First, the rate of increase of true information is about a third of what it was in the 0.5-average condition. Second, there is a constant, noisy but non-negligible amount of false information.
Moreover, the threshold proportion of trolls above which reversion becomes ineffective is typically around 10%, although this value varies with parameters such as the size of the modifications performed by trolls.
According to Viegas et al. (2004), about half of mass deletions are reverted within three minutes.
Note that the results would not necessarily be robust to a decrease in troll activity, because this would both make trolls less disruptive and harder to detect (which hampers the efficiency of the administration/revert solution).
We thank an anonymous referee for drawing our attention to the distribution of user activity.
We thank an anonymous referee for urging us to mention such motivations.
Currently more than 5 million entries; see https://en.wikipedia.org/wiki/Wikipedia:Size_of_Wikipedia.
One particular difficulty for modelling multiple entries is to find a common time scale. In our graphs, changes are measured as functions of the number of changes of entry version. However, such changes may occur at different paces for different entries, which complicates their combination.
For a fuller description of Wikipedia’s administrators, see https://en.wikipedia.org/wiki/Wikipedia:Administrators.
Anderson, C. (2006). The long tail. New York: Hyperion.
Fallis, D. (2011). Wikipistemology. In A. I. Goldman & D. Whitcomb (Eds.), Social epistemology: Essential readings. Oxford: Oxford University Press.
Frankfurt, H. (2005). On bullshit. Princeton: Princeton University Press.
Giles, J. (2005). Internet encyclopaedias go head to head. Nature, 438, 900–901.
Goldman, A. (1987). Foundations of social epistemics. Synthese, 73, 109–144.
Hull, D. (1988). Science as a process. Chicago: University of Chicago Press.
Kitcher, P. (1990). The division of cognitive labor. The Journal of Philosophy, 87(1), 5–22.
Magnus, P. D. (2009). On trusting Wikipedia. Episteme, 6(1), 74–90.
Sanger, L. M. (2009). The fate of expertise after Wikipedia. Episteme, 6(1), 52–73.
Shachaf, P., & Hara, N. (2010). Beyond vandalism: Wikipedia trolls. Journal of Information Science, 36, 357–370.
Strevens, M. (2003). The role of the priority rule in science. The Journal of Philosophy, 100(2), 50–79.
Surowiecki, J. (2004). The wisdom of crowds. Garden City: Anchor Books.
Tsvetkova, M., García-Gavilanes, R., Floridi, L., & Yasseri, T. (2017). Even good bots fight: The case of Wikipedia. PLoS ONE, 12(2), e0171774. https://doi.org/10.1371/journal.pone.0171774.
Viegas, F., Wattenberg, M., & Dave, K. (2004). Studying cooperation and conflict between authors with history flow visualizations. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '04), 575–582.
Weisberg, M., & Muldoon, R. (2009). Epistemic landscapes and the division of cognitive labor. Philosophy of Science, 76(2), 225–252.
Zollman, K. J. S. (2007). The communication structure of epistemic communities. Philosophy of Science, 74, 574–587.
We thank Anouk Barberousse for her comments on an early version of this work.
The simulations were programmed and run by Valentin Lageard.
Lageard, V., Paternotte, C. Trolls, bans and reverts: simulating Wikipedia. Synthese (2018). https://doi.org/10.1007/s11229-018-02029-0
- Social epistemology
- Computer simulation
- Collective knowledge
- Wisdom of crowds