Business & Information Systems Engineering, Volume 57, Issue 3, pp 167–179

How (not) to Incent Crowd Workers

Payment Schemes and Feedback in Crowdsourcing
  • Tim Straub
  • Henner Gimpel
  • Florian Teschner
  • Christof Weinhardt
Research Paper

DOI: 10.1007/s12599-015-0384-2

Cite this article as:
Straub, T., Gimpel, H., Teschner, F. et al. Bus Inf Syst Eng (2015) 57: 167. doi:10.1007/s12599-015-0384-2

Abstract

Crowdsourcing is gaining momentum: in digital workplaces such as Amazon Mechanical Turk, oDesk, Clickworker, 99designs, or InnoCentive, it is easy to distribute human work to hundreds or thousands of freelancers. In these crowdsourcing settings, one challenge is to properly incent worker effort to create value. Common incentive schemes are piece rate payments and rank-order tournaments among workers. Tournaments may or may not disclose a worker’s current competitive position via a leaderboard. Following an exploratory approach, we derive a model of worker performance in rank-order tournaments and present a series of real effort studies using experimental techniques on an online labor market to test the model and to compare dyadic tournaments to piece rate payments. The data suggest that, on average, dyadic tournaments do not improve performance over a simple piece rate for simple and short crowdsourcing tasks. Furthermore, giving feedback on the competitive position in such tournaments tends to be negatively related to workers’ performance. This relation is partially mediated by task completion and moderated by the strength of the competition: when playing against strong competitors, feedback is associated with workers quitting the task altogether and thus showing lower performance; when the competitors are weak, workers tend to complete the task but with reduced effort. Overall, individual piece rate payments are the simplest to communicate and implement, while the performance they incent is on par with that of more complex dyadic tournaments.
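The two incentive schemes the abstract compares can be illustrated with a minimal sketch. The function names, integer-cent amounts, and tie-handling below are illustrative assumptions, not the parameters used in the paper:

```python
def piece_rate_payout_cents(units_completed: int, rate_cents: int) -> int:
    """Piece rate: each completed work unit earns a fixed amount."""
    return units_completed * rate_cents

def dyadic_tournament_payout_cents(own_score: int, opponent_score: int,
                                   base_cents: int, bonus_cents: int) -> int:
    """Dyadic (two-worker) rank-order tournament: both workers receive a
    base payment, and the higher-scoring worker additionally wins a bonus.
    Tie handling (no bonus on a tie) is an assumption of this sketch."""
    return base_cents + (bonus_cents if own_score > opponent_score else 0)

# Hypothetical example: 40 completed units at 2 cents each
print(piece_rate_payout_cents(40, 2))                    # 80
# Worker scoring 40 against an opponent scoring 35, 40-cent base + 80-cent bonus
print(dyadic_tournament_payout_cents(40, 35, 40, 80))    # 120
# The same worker losing against an opponent scoring 45 keeps only the base
print(dyadic_tournament_payout_cents(40, 45, 40, 80))    # 40
```

Under the piece rate, pay depends only on a worker's own output; in the dyadic tournament, part of the pay depends on relative rank, which is what a leaderboard makes salient.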

Keywords

Crowdsourcing · Online labor · Incentives · Exploratory study · Experimental techniques · Real effort task · Rank-order tournament · Piece rate · Feedback

Copyright information

© Springer Fachmedien Wiesbaden 2015

Authors and Affiliations

  • Tim Straub (1)
  • Henner Gimpel (2)
  • Florian Teschner (3)
  • Christof Weinhardt (1)
  1. Institute of Information Systems and Marketing, Karlsruhe Service Research Institute (KSRI), Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
  2. Research Center Finance and Information Management, Project Group Business and Information Systems Engineering of Fraunhofer FIT, University of Augsburg, Augsburg, Germany
  3. Institute of Information Systems and Marketing, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
