
From open-source software to Wikipedia: ‘Backgrounding’ trust by collective monitoring and reputation tracking

  • Original Paper
  • Published in Ethics and Information Technology

Abstract

Open-content communities that focus on co-creation without requirements for entry have to face the issue of institutional trust in contributors. This research investigates the various ways in which these communities manage this issue. It is shown that communities of open-source software continue to rely mainly on hierarchy (reserving write-access for higher echelons), which substitutes for (the need for) trust. Encyclopedic communities, though, largely avoid this solution. In the particular case of Wikipedia, which is confronted with persistent vandalism, another arrangement has been pioneered instead. Trust (i.e. full write-access) is ‘backgrounded’ by means of a permanent mobilization of Wikipedians to monitor incoming edits. Computational approaches have been developed for the purpose, yielding both sophisticated monitoring tools that are used by human patrollers, and bots that operate autonomously. Measures of reputation are also under investigation within Wikipedia; their incorporation in monitoring efforts, as an indicator of the trustworthiness of editors, is envisaged. These collective monitoring efforts are interpreted as focusing on avoiding possible damage being inflicted on Wikipedian spaces, thereby allowing the discretionary powers of editing to remain intact for all users. Further, the essential differences between backgrounding and substituting trust are elaborated. Finally, it is argued that the Wikipedian monitoring of new edits, especially through its heavy reliance on computational tools, raises a number of moral questions that need to be answered urgently.


Notes

  1. Note that West et al. (2012)—an important reference later on in the section on reputation—fails to make this distinction and lumps all OCCs together under one label: Collaborative Web Applications (CWAs).

  2. Andrea Forte and Cliff Lampe introduce the category of ‘open collaboration projects’ in their introductory piece to a recent special issue of the American Behavioral Scientist about such projects (Forte and Lampe 2013). In my terminology this refers to both co-contributing (2.0) and co-creation (3.0) communities without barriers to entry.

  3. As a rule, co-created outcomes are licensed to the public with a so-called Creative Commons licence. This aspect, though of crucial importance, does not figure in this article and therefore deserves to be mentioned at least in this footnote.

  4. Throughout this article I employ the terms entry/admission and access as follows. ‘Entry’ or ‘admission’ refers to being accepted as a participant in the co-creative process; ‘access’ refers to subsequently obtaining permission to carry out various activities associated with the process. Compare the—albeit imperfect—analogy of entering a building through the front door and reaching a hallway (entry, admission), and subsequently gaining access to the various floors (access).

  5. I do not return to the topic of ‘netiquette’. It is assumed, by default, that it exists in some form, and is actively maintained and ‘applied’, in textual/pictorial OCCs. Similarly, without mentioning it explicitly, it is assumed, again as the default, that OSS communities are culturally ‘embedded’ in a hacker ethic.

  6. Notice that the essential differences between the third and fourth mechanisms of trust management are fleshed out and explored more extensively in the sections that follow, culminating in a more elaborate analysis under the section ‘Collective monitoring within Wikipedia: interpretation’.

  7. I felt free to use the masculine personal pronoun in the paragraphs above, since almost all developers concerned are male.

  8. Attempts to write a book collectively (a ‘networked book’) have failed. Several projects in this vein were initiated, the most famous of which was staged by Penguin (dubbed ‘A Million Penguins’), inviting the crowds to produce a book together (2007). In a time span of five weeks a ‘wikinovel’ was produced, with some 1,500 people contributing (Pullinger 2012). From our perspective of trust, the main observation to be made is that, due to the many reactions ultimately verging on vandalism, a team of students had to filter incoming edits: a hierarchical kind of arrangement that figures later in the main text as well.

  9. Note that more such general encyclopedias have actually been initiated during the last decade, many of them copying the software of the Wikipedian production model (available as ‘open source’). They are not taken into consideration here since they are either only part-encyclopedia, or carry a distinct ideological message, or have simply not survived.

  10. The exceptions to this rule either slightly qualify full write-access (Wikinews’ front page), or never introduced it in the first place (Scholarpedia); to be commented on below.

  11. These observations refer to the English version of Wikipedia. In the remainder of this article, unless specified otherwise, I always refer to that language version—actually the largest of all language versions of Wikipedia.

  12. Some conditions have been introduced in Wikipedia that qualify write-access for all (WP:UAL). Any user, even without an account (‘unregistered’), may read and edit entries (pages). Upon registration, the user may also create new pages. After some time (four days and at least ten edits) the registered user automatically becomes ‘autoconfirmed’, which implies that (s)he may also move pages around and upload files and pictures. Write-access may be said to be ‘complete’ by then. Currently over a million (English) users are autoconfirmed. Let me remark finally, in order to avoid any misunderstanding, that write-access involves not only the right to add or change text but also the right to delete it.
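To fix ideas, the tiered scheme just described can be rendered as a small sketch. The following Python fragment is my own simplification, with hypothetical names and structure; it is not MediaWiki's actual user-rights machinery, only an illustration of the rules in this note (cf. WP:UAL).

```python
from datetime import datetime, timedelta

# Thresholds for 'autoconfirmed' status as described in this note.
AUTOCONFIRM_AGE = timedelta(days=4)
AUTOCONFIRM_EDITS = 10

def permissions(registered, account_created, edit_count, now):
    """Return the set of write-access rights for a (hypothetical) user record."""
    perms = {"read", "edit"}          # any user, even unregistered, may read and edit
    if registered:
        perms.add("create_page")      # registration adds the right to create new pages
        if (now - account_created >= AUTOCONFIRM_AGE
                and edit_count >= AUTOCONFIRM_EDITS):
            # 'autoconfirmed': write-access may be said to be complete
            perms.update({"move_page", "upload_file"})
    return perms

# Example: a five-day-old account with twelve edits is autoconfirmed.
now = datetime(2014, 2, 10)
assert "move_page" in permissions(True, now - timedelta(days=5), 12, now)
```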

  13. Note the analogy with the division of roles in OSS: observers’ contributions have to be scrutinized by developers before acceptance.

  14. Some time ago, based on the likely introduction of this flagged-revisions scheme, I foresaw a convergence of the designs for open-source software and encyclopedias (de Laat 2010). It has now become clear that this convergence is not taking place.

  15. As demonstrated in de Laat (2012c), social news sites and citizen journals similarly rely on backgrounding trust: voting schemes push high-quality articles to a prominent or visible position and, likewise, relegate low-quality contributions to an inconspicuous or invisible position. These sites are not considered here, however, as they belong to the 2.0 co-contributing category.

  16. On a more personal note, let me quote from my own recent experience of vandalism patrolling: words may be inserted (yodolohee, poo, popcorn, peanut butter), substituted (Boeing 747 Dreamliner is changed into Nighmareliner; a hip hop album sales figure is changed from 295,000 to 2,295,000), or whole paragraphs blanked (or replaced by HAHAHA). Vandalist insertions can also be larger, and even be creative. Let me give the example of the entry ‘Heat Pipe’, in the middle of which the following lines were inserted (at 20:42 on 28 February 2013): “A little known fact is that number of Dwarfs actually live inside these pipes and help with constant maintenance, they may need to be replaced at some point in the computers life due to wars between the dwarfs that end with numerous casualties. Treaties have been implemented between the dwarf clans, but they can never live in harmony.”
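Patterns like these are exactly what the computational monitoring discussed in the main text tries to catch. Below is a toy sketch of such pattern heuristics in Python; it is my own illustration and does not reproduce the logic of any actual tool (Huggle, STiki, ClueBot or otherwise).

```python
import re

def vandalism_signals(old_text, new_text):
    """Crude heuristics for the vandalism patterns recounted in this note."""
    signals = []
    # Blanking: most of the entry disappears.
    if old_text and len(new_text) < 0.2 * len(old_text):
        signals.append("mass blanking")
    # Shouted repetition, e.g. 'HAHAHA' or long runs of a single character.
    if re.search(r"(HA){3,}|(.)\2{9,}", new_text):
        signals.append("repetition/shouting")
    # A figure changed in place (e.g. 295,000 into 2,295,000): numbers
    # both appear in and disappear from the text at the same time.
    old_nums, new_nums = (set(re.findall(r"\d[\d,]+", t)) for t in (old_text, new_text))
    if (new_nums - old_nums) and (old_nums - new_nums):
        signals.append("numeric substitution")
    return signals

# Example: a paragraph replaced by 'HAHAHA' trips two signals.
print(vandalism_signals("A heat pipe is a heat-transfer device.", "HAHAHA"))
```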

  17. An early example of ‘algorithmic power’, to be discussed more fully below.

  18. In a larger vein not only new edits but also new entries (pages) as a whole are watched constantly. ‘New Pages Patrol’ is a system that signals newly created pages and invites Wikipedians to check whether or not these conform to various criteria (concerning not only vandalism, but also relevance, substance, harassment, advertising, copyright violations, etc.) (WP:NPP). Unwelcome candidates are to be nominated for so-called ‘speedy deletion’. This patrol is intended to eradicate quality problems right from the start.

  19. In actual fact, as soon as someone’s reputation is too low in relation to the credibility of a specific edit, endorsing the edit does not increase its credibility at all.

  20. This model has resulted in a practical tool: the WikiTrust extension (Adler et al. 2008). It continuously calculates the credibility of words in an entry as ‘voting’ continues and assigns colours to them accordingly (ever lighter shades of orange indicate greater age). The tool may assist users in focusing their efforts on the fresh parts of the text (dark orange).
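The flavour of such content-driven ‘voting’ can be conveyed with a schematic sketch. What follows is my own simplification, not the actual algorithm of Adler et al. (2008); scales, thresholds, and increments are invented for illustration.

```python
def endorse(word_credibility, endorser_reputation):
    """One implicit 'vote': a later editor leaves a word intact.
    Per the previous note, an endorser whose reputation is too low
    relative to the word's current credibility adds nothing.
    Both quantities are assumed here to run from 0.0 to 1.0."""
    if endorser_reputation <= word_credibility:
        return word_credibility
    return min(1.0, word_credibility + 0.1 * (endorser_reputation - word_credibility))

def shade(credibility):
    """Map credibility to a colour: dark orange marks fresh, untested text,
    with ever lighter shades as text survives more scrutiny."""
    levels = ["dark orange", "orange", "light orange", "near white"]
    return levels[min(int(credibility * len(levels)), len(levels) - 1)]

# A fresh word endorsed by a high-reputation editor lightens only a little.
print(shade(endorse(0.1, 0.9)))   # -> 'dark orange' still (credibility 0.18)
```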

  21. A similar experience can be obtained from using the experimental tool wpcvn.com. It presents possible instances of vandalism that occurred over the last hour to its human operators, combined with the ‘karma’ (i.e., reputation) of their authors. Only edits performed by contributors with negative karma are shown. The design thus steers attention to low-karma contributors. This tool, however, is no longer working as of January 2014.
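The underlying selection rule is simple enough to state in code. A one-line sketch, assuming a hypothetical feed of recent edits paired with their authors’ karma:

```python
def edits_to_review(recent_edits):
    """wpcvn-style filter: surface only edits whose author has negative karma.
    'recent_edits' is assumed to be an iterable of (edit, author_karma) pairs."""
    return [edit for edit, karma in recent_edits if karma < 0]
```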

  22. Note that the number of edits to Wikipedia patrolled by means of STiki is also tracked on a ‘leader board’: ‘assisted editing’ itself is thus subjected to gamification as well.

  23. The privilege rotates regularly over the Slashdot population-of-high-repute as a whole, thereby avoiding role fixation.

  24. Due to operational difficulties this reputational type of engine for STiki is now out of order.

  25. Notice that I followed Jaques’ convention of exclusive use of the masculine personal pronoun throughout. Of course, it was 1956 then; nowadays a gender-neutral use of pronouns is considered more appropriate.

  26. This corresponds to the debate about models of how the human mind accepts information: a procedure à la Spinoza versus a procedure à la Descartes (Gilbert et al. 1990).

  27. Giving ample references here would take me too far afield, but let me just mention two of them. For access control, cf. O’Connor and Loomis (2010); for anti-intrusion systems, cf. Scarfone and Mell (2007).

  28. Another comparison of an epistemological kind can be drawn. The Wikipedian monitoring campaign can be seen as an institutional form of ‘epistemic vigilance’ concerning information communicated by others—as Sperber et al. (2010) coined the term. In our case, such vigilance is not exercised in judicial or scientific institutions (idem: 383), but in the largest open-content encyclopedia of all.

References

All websites were last accessed on February 10, 2014.

  • Adler, B. T., Chatterjee, K., de Alfaro, L., Faella, M., Pye, I., & Raman, V. (2008). Assigning trust to Wikipedia content. In Proceedings of the 4th International Symposium on Wikis (WikiSym’08), September 8–10, 2008, Porto, Portugal. http://dx.doi.org/10.1145/1822258.1822293.

  • Adler, B. T., & de Alfaro, L. (2007). A content-driven reputation system for the Wikipedia. In Proceedings of the 16th International Conference on World Wide Web, May 8–12, 2007, Banff, Alberta, Canada. http://dx.doi.org/10.1145/1242572.1242608.

  • Adler, B. T., de Alfaro, L., Mola-Velasco, S. M., Rosso, P., & West, A. G. (2011). Wikipedia vandalism detection: Combining natural language, metadata, and reputation features. In CICLing ‘11: Proceedings of the 12th International Conference on Intelligent Text Processing and Computational Linguistics, LNCS 6609 (pp. 277–288), Tokyo, Japan.

  • Brey, P. (2000). Disclosive computer ethics. Computers and Society, 30(4), 10–16.


  • Cohen, L. J. (1989). Belief and acceptance. Mind, New Series, 98(391), 367–389.


  • Crowston, K., Annabi, H., Howison, J., & Masango, Ch. (2004). Effective work practices for software engineering: Free/libre open source software development. In Proceedings of the 2004 ACM workshop on Interdisciplinary software engineering research (WISER ‘04) (pp. 18–26), ACM, New York, NY, USA. http://doi.acm.org/10.1145/1029997.1030003.

  • de Laat, P. B. (2007). Governance of open source software: State of the art. Journal of Management and Governance, 11(2), 165–177.


  • de Laat, P. B. (2010). How can contributors to open-source communities be trusted? On the assumption, inference, and substitution of trust. Ethics and Information Technology, 12(4), 327–341.


  • de Laat, P. B. (2012a). Open source production of encyclopedias: Editorial policies at the intersection of organizational and epistemological trust. Social Epistemology, 26(1), 71–103.


  • de Laat, P. B. (2012b). Coercion or empowerment? Moderation of content in Wikipedia as ‘essentially contested’ bureaucratic rules. Ethics and Information Technology, 14(2), 123–135.


  • de Laat, P. B. (2012c). Navigating between chaos and bureaucracy: Backgrounding trust in open-content communities. In K. Aberer et al. (Eds.), Proceedings of the 4th International Conference on Social Informatics, SocInfo 2012, LNCS 7710 (pp. 534–557), December 5–7, Lausanne, Switzerland. Heidelberg: Springer.

  • de Laat, P. B. (2014). Tools and bots against vandalism: Eroding Wikipedia’s moral order? In Proceedings of the 11th International Conference of Computer Ethics: Philosophical Explorations (CEPE), Paris.

  • Deterding, S., Dixon, D., Khaled, R., & Nacke, L. E. (2011). From game design elements to gamefulness: Defining »Gamification«. In Mindtrek 2011 Proceedings, Tampere: ACM Press.

  • Dutton, W. H. (2008). The wisdom of collaborative network organizations: Capturing the value of networked individuals. Prometheus, 26(3), 211–230.


  • Farmer, F. R., & Glass, B. (2010). Building web reputation systems. Sebastopol: O’Reilly.


  • Forte, A., & Lampe, C. (2013). Defining, understanding, and supporting open collaboration: Lessons from the literature. American Behavioral Scientist, 57(5), 535–547.


  • Gilbert, D., Krull, D., & Malone, P. (1990). Unbelieving the unbelievable: Some problems in the rejection of false information. Journal of Personality and Social Psychology, 59(4), 601–613.


  • Holck, J., & Jørgensen, N. (2005). Do not check in on red: Control meets anarchy in two open source projects. In S. Koch (Ed.), Free/open source software development (pp. 1–26). Hershey: Idea Group.


  • Jaques, E. (1956). Measurement of responsibility: A study of work, payment, and individual capacity. London: Tavistock.


  • McGeer, V. (2008). Trust, hope and empowerment. Australasian Journal of Philosophy, 86(2), 237–254.


  • Moran, R. (2005). Getting told and being believed. Philosophers’ Imprint, 5(5); also published in J. Lackey, & E. Sosa (Eds.) (2006), The epistemology of testimony (pp. 272–306). Oxford: Oxford University Press.

  • O’Connor, A. C., & Loomis, R. J. (2010). Economic analysis of role-based access control. Prepared for NIST. http://csrc.nist.gov/groups/SNS/rbac/documents/20101219_RBAC2_Final_Report.pdf.

  • Orsila, H., Geldenhuys, J., Ruokonen, A., & Hammouda, I. (2009). Trust issues in open source software development. In N. Medvidovic, & T. Tamai (Eds.), Proceedings of the Warm Up Workshop for ACM/IEEE ICSE 2010 (WUP ‘09) (pp. 9–12), ACM, New York, NY, USA. http://doi.acm.org/10.1145/1527033.1527037.

  • Pullinger, K. (2012). ‘A million penguins’ five years on, blog post from 25 January 2012. http://www.katepullinger.com/blog/comments/a-million-penguins-five-years-on/.

  • Scarfone, K., & Mell, P. (2007). Guide to intrusion detection and prevention systems (IDPS). NIST Special Publication 800-94. http://csrc.nist.gov/publications/nistpubs/800-94/SP800-94.pdf.

  • Sperber, D., Clément, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., & Wilson, D. (2010). Epistemic vigilance. Mind & Language, 25, 359–393.

  • West, A. G. (2011). Anti-vandalism research: The year in review (Presentation at Wikimania 2011). www.cis.upenn.edu/~westand/docs/wikimania_11_vandalism_slides.pdf.

  • West, A. G., Chang, J., Venkatasubramanian, K. K., & Lee, I. (2012). Trust in collaborative web applications. Future Generation Computer Systems, 28(8), 1238–1251. http://dx.doi.org/10.1016/j.future.2011.02.007.


  • WP:Huggle. http://en.wikipedia.org/wiki/Wikipedia:Huggle.

  • WP:Lupin. http://en.wikipedia.org/wiki/User:Lupin/Anti-vandal_tool.

  • WP:NPP. http://en.wikipedia.org/wiki/Wikipedia:Npp.

  • WP:STiki. http://en.wikipedia.org/wiki/Wikipedia:STiki.

  • WP:Twinkle. http://en.wikipedia.org/wiki/Wikipedia:Twinkle.

  • WP:UAL. http://en.wikipedia.org/wiki/Wikipedia:User_access_levels.


Acknowledgments

Thanks are due to two anonymous reviewers of this journal for their comments, in particular for alerting me to the comparison with cyber security studies.

Author information

Correspondence to Paul B. de Laat.


About this article


Cite this article

de Laat, P.B. From open-source software to Wikipedia: ‘Backgrounding’ trust by collective monitoring and reputation tracking. Ethics Inf Technol 16, 157–169 (2014). https://doi.org/10.1007/s10676-014-9342-9
