Crowd Dynamics: Conflicts, Contradictions, and Community in Crowdsourcing

  • Karin Hansson
  • Thomas Ludwig


Keywords: Crowdsourcing; Crowd Dynamics; Contradiction; CSCW

1 Introduction

Unfair reputation systems, slow payments, lack of transparency, discrimination, and socio-spatial inequalities are only some of the many reasons behind conflicts in crowdsourcing. The divisive logic of the system and the sharing processes in the peer-community create interesting dynamics and new foci on old conflicts.

The development of new types of working relationships has previously been problematized for several reasons. For example, Irani and Silberman (2014) have questioned crowd-worker dynamics from a labor rights perspective, leading to calls for collective action by crowd workers (Salehi et al. 2015). An analysis of the online discussion in the community of workers on Amazon Mechanical Turk shows the tensions between the dividing logic of the system and the information-sharing processes in the community (Martin et al. 2014). A study of Indian workers by Gupta et al. (2014) shows how a lack of control over the work environment affects participants. Ludwig et al. (2015) show the impact of crowdsourcing in crisis management and the dangers to which workers expose themselves during their tasks.

Other common conflicts are due to rejected work, slow or unfair payments, lack of transparency, and technical problems (Silberman et al. 2010). In particular, wages are a conflict area among “web workers” (Bederson and Quinn 2011). In the area of e-democracy, digital differentiation and inequalities within the crowd become problematic (Hansson et al. 2015). Studies of Amazon Mechanical Turk (Fort et al. 2011), Wikipedia (Menking and Erickson 2015; Ortega et al. 2008), and Twitter (Duggan et al. 2015) indicate a lack of representativeness in terms of age, gender, and education. Cultural geographers have also pointed out the hegemonic discourses and socio-spatial relations in the geographic web (Crampton et al. 2013; Shelton et al. 2014; Soden and Palen 2014; Zook et al. 2015). Research on conflicts within online groups tends to focus on communities like those in open source development (Filippova and Cho 2015), professional groups, or learning processes (Aaltonen and Kallinikos 2012). However, there is a need for more empirical research on the conflicts and dynamics within the more fluid work relations in crowd work, and on typologies of participation that indicate levels of power and agency in this context.

In this special issue, we therefore gather research that develops new methodologies for exploring power asymmetry, motivation, working conditions, and community dynamics in crowdsourcing, as well as new typologies to describe the relations within it.

It can be challenging to study crowdsourcing processes due to their distributed nature. Therefore, in “Investigating the Amazon Mechanical Turk Market through Tool Design,” Benjamin Vincent Hanrahan, David Martin, Jutta K. Willamowski, and John M. Carroll use a design process as a technical probe to engage a group of Amazon Mechanical Turk workers, so-called Turkers. Through the probe they identify a number of previously unreported difficulties that crowdworkers face both in building their own tools and in working on AMT, such as the high velocity of the market and a lack of infrastructural support for workers.

Another methodological challenge is how to identify and describe power asymmetry within the group of workers. In “Crowd Anatomy Beyond the Good and Bad: Behavioral Traces for Crowd Worker Modeling and Pre-selection,” Ujwal Gadiraju, Gianluca Demartini, Ricardo Kawase, and Stefan Dietze address the issue of inequalities between workers, proposing an automated mechanism that uses machine learning models to detect differences between worker types. This can be useful for identifying the need to support less effective and efficient workers.

One of the major challenges faced in a crowdsourcing project is attaining a high level of engagement over time. In their study of Wikidata editors, Cristina Sarasua, Alessandro Checco, Gianluca Demartini, Djellel E. Difallah, Michael Feldman, and Lydia Pintscher explore the means to predict user engagement at an early stage. “The Evolution of Power and Standard Wikidata Editors – Comparing Editing Behavior over Time to Predict Lifespan and Volume of Edits” provides a large-scale longitudinal data analysis that covers Wikidata edits over almost four years, observing the way the participation, the volume and the diversity of edits done by Wikidata editors change.

Benjamin Y. Clark and Jeffrey L. Brudney conducted another longitudinal study. In “Citizen Representation in City Government-Driven Crowdsourcing”, they analyzed the citizen representativeness of crowdsourcing achieved through the 311 systems. These systems provide access to non-emergency municipal services. The article shows that no systematic biases exist in participation rates across a range of socio-economic indicators and that participation may be responding positively to the city’s responsiveness.

While the focus in crowdsourcing research is often on workers, some researchers highlight the requesters’ role in shaping workers’ conditions. In “Rating Working Conditions on Digital Labor Platforms,” Ellie Harmon and Six Silberman present their work for IG Metall, the German Metalworkers’ Union, where they developed a method for rating working conditions on digital labor platforms, providing a rich case study from this trade union setting.

Civic crowdfunding is another type of setting in which the ambition is not only efficiency and workers’ rights but potentially a stronger democracy, involving citizens more directly in both service delivery and the community development process. In “Examining Community Dynamics of Civic Crowdfunding Participation,” Martin Mayer investigates community dynamics and their potential impact on project success in jurisdictions where civic crowdfunding proposals are put forward. The results highlight the dynamics and characteristics of contexts in which project proposals are likely to be funded.

While crowdsourcing is increasingly used for data gathering, problem solving, and civic participation, the relationships within its social structure remain largely unexamined. In the final paper in this special issue, “Capitalizing Relationships: Modes of Participation in Crowdsourcing,” Karin Hansson, Thomas Ludwig, and Tanja Aitamurto propose a typology of participation in crowdsourcing. They provide a theoretical model for analyzing crowdsourcing in terms of relations, to inform the design of appropriate technologies and processes.

In conclusion, we hope the design and research community will find this Special Issue to be a useful collection of papers that provide an informative foundation for further research in this field. We also wish to thank all authors who contributed and the set of capable reviewers.

Karin Hansson and Thomas Ludwig, guest editors.



References

  1. Aaltonen, Aleksi; and Jannis Kallinikos (2012). Coordination and Learning in Wikipedia: Revisiting the Dynamics of Exploitation and Exploration. In M. Holmqvist and A. Spicer (eds.), Managing ‘Human Resources’ by Exploiting and Exploring People’s Potentials. Research in the Sociology of Organizations. Emerald Group Publishing Limited, pp. 161–192.
  2. Bederson, Benjamin B.; and Alexander J. Quinn (2011). Web Workers Unite! Addressing Challenges of Online Laborers. Human Factors, 2011, pp. 97–105.
  3. Crampton, Jeremy W.; Mark Graham; Ate Poorthuis; Taylor Shelton; Monica Stephens; Matthew W. Wilson; and Matthew Zook (2013). Beyond the geotag: situating ‘big data’ and leveraging the potential of the geoweb. Cartography and Geographic Information Science, vol. 40, no. 2, 2013, pp. 130–139.
  4. Duggan, Maeve; Nicole B. Ellison; Cliff Lampe; Amanda Lenhart; and Mary Madden (2015). Demographics of Key Social Networking Platforms. Pew Research Center, 9 January 2015.
  5. Filippova, Anna; and Hichang Cho (2015). Mudslinging and Manners. In CSCW ’15. Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing, Vancouver, British Columbia, Canada, 14-18 March 2015. New York: ACM Press, pp. 1393–1403.
  6. Fort, Karën; Gilles Adda; and K. Bretonnel Cohen (2011). Amazon Mechanical Turk: Gold Mine or Coal Mine? Computational Linguistics, vol. 37, no. 2, 2011, pp. 413–420.
  7. Gupta, Neha; David Martin; Benjamin V. Hanrahan; and Jacki O’Neill (2014). Turk-Life in India. In GROUP ’14. Proceedings of the 18th International Conference on Supporting Group Work. New York: ACM Press, pp. 1–11.
  8. Hansson, Karin; Kheira Belkacem; and Love Ekenberg (2015). Open Government and Democracy. Social Science Computer Review, vol. 33, no. 5, 2015, pp. 540–555.
  9. Irani, Lilly; and M. Six Silberman (2014). From critical design to critical infrastructure. interactions, vol. 21, no. 4, July 2014, pp. 32–35.
  10. Ludwig, Thomas; Christian Reuter; Tim Siebigteroth; and Volkmar Pipek (2015). CrowdMonitor: Mobile Crowd Sensing for Assessing Physical and Digital Activities of Citizens during Emergencies. In CHI ’15. Proceedings of the International Conference on Human Factors in Computing Systems, Seoul, Korea, 18-23 April 2015. New York: ACM Press, pp. 4083–4092.
  11. Martin, David; Benjamin V. Hanrahan; Jacki O’Neill; and Neha Gupta (2014). Being a turker. In CSCW ’14. Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing, Baltimore, Maryland, 15-19 February 2014. New York: ACM Press, pp. 224–235.
  12. Menking, Amanda; and Ingrid Erickson (2015). The Heart Work of Wikipedia. In CHI ’15. Proceedings of the International Conference on Human Factors in Computing Systems, Seoul, Korea, 18-23 April 2015. New York: ACM Press, pp. 207–210.
  13. Ortega, Felipe; Jesus M. Gonzalez-Barahona; and Gregorio Robles (2008). On the inequality of contributions to Wikipedia. In HICSS ’08. Proceedings of the Annual Hawaii International Conference on System Sciences, Hawaii, 7-10 January 2008. IEEE, pp. 304-304.
  14. Salehi, Niloufar; Lilly C. Irani; Michael S. Bernstein; Ali Alkhatib; Eva Ogbe; and Kristy Milland (2015). We Are Dynamo: Overcoming Stalling and Friction in Collective Action for Crowd Workers. In CHI ’15. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. New York: ACM Press, pp. 1621–1630.
  15. Shelton, Taylor; A. Poorthuis; M. Graham; and M. Zook (2014). Mapping the data shadows of hurricane Sandy: Uncovering the sociospatial dimensions of ‘Big Data’. Geoforum, vol. 52, March 2014, pp. 167–179.
  16. Silberman, M. Six; Joel Ross; Lilly Irani; and Bill Tomlinson (2010). Sellers’ problems in human computation markets. In HCOMP ’10. Proceedings of the ACM SIGKDD Workshop on Human Computation, Washington DC, 25-25 July 2010. New York: ACM Press, pp. 18-21.
  17. Soden, Robert; and Leysia Palen (2014). From Crowdsourced Mapping to Community Mapping: The Post-Earthquake Work of OpenStreetMap Haiti. In COOP 2014: Proceedings of the 11th International Conference on the Design of Cooperative Systems, 27-30 May 2014, Nice, France. London: Springer, pp. 311–326.
  18. Zook, Matthew; Mark Graham; and Andrew Boulton (2015). Crowd-Sourced Augmented Realities: Social Media and the Power of Digital Representation. In S. P. Mains, J. Cupples, and C. Lukinbeal (eds.), Mediated Geographies and Geographies of Media. Springer, pp. 223–240.

Copyright information

© Springer Nature B.V. 2018

Authors and Affiliations

  1. Department of Computer and Systems Sciences, DSV, Stockholm University, Stockholm, Sweden
  2. Cyber-Physical Systems, University of Siegen, Siegen, Germany