
A Six-Month, Multi-platform Investigation of Creative Crowdsourcing


Part of the Human–Computer Interaction Series book series (HCIS)

Abstract

Crowdsourcing platforms can be roughly divided into two kinds: the ones that offer simple, short, and unskilled work (microtasking) and those that offer complex, longer tasks, which are difficult to break down and usually involve creativity (macrotasking). Past research has mapped the landscape of microtask crowdsourcing. Little, however, is known about where commercial platforms stand when it comes to creative crowdsourcing. Which types of creative tasks are offered? How are these remunerated? Do all platforms facilitate the same type of creative work? Given the increasing importance that creative crowdsourcing is expected to play in the near future, in this chapter we partially map the current state of this type of online work over time. During a six-month period, and on a daily basis, we collected public data from seven creative crowdsourcing platforms. Our data, covering more than thirteen thousand tasks, show that there are plenty of graphic design tasks but better financial rewards for other types of creative tasks, as well as a trend for creative crowd work platforms to offer longer tasks. Judging from the total rewards in those six months, we can also conclude that creative crowdsourcing will benefit from a shift to dynamic rather than fixed rewards, but also that this type of crowd work is still at an embryonic stage and has growth potential. Finally, our results highlight the need for a platform data watchdog, as well as the need for a more nuanced perspective of creative crowdsourcing, distinguishing between the types of platforms within this genre of online work.
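The longitudinal method described above — daily snapshots of public task listings from several platforms, later deduplicated and summarised — can be sketched in a few lines. This is a minimal illustration only: the `Task` fields, the fetcher callables, and the category names are assumptions for the sketch, not the authors' actual collection pipeline.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class Task:
    # Illustrative fields; the chapter's real schema is not specified here.
    platform: str
    task_id: str
    category: str
    reward_usd: float

def daily_snapshot(fetchers):
    """Poll each platform's public listing once per day.

    `fetchers` are callables returning lists of Task — stand-ins for
    per-platform scrapers of public pages."""
    tasks = []
    for fetch in fetchers:
        tasks.extend(fetch())
    return tasks

def aggregate(snapshots):
    """Deduplicate tasks seen across daily snapshots, then summarise
    unique-task counts and total rewards per category."""
    seen = {}
    for snap in snapshots:
        for t in snap:
            # A long-running contest appears in many daily snapshots;
            # keying on (platform, task_id) counts it once.
            seen[(t.platform, t.task_id)] = t
    summary = defaultdict(lambda: {"count": 0, "total_reward": 0.0})
    for t in seen.values():
        summary[t.category]["count"] += 1
        summary[t.category]["total_reward"] += t.reward_usd
    return dict(summary)
```

Deduplicating on a stable (platform, task id) key is what lets daily polling double as longitudinal data: the same listing observed on consecutive days reveals task duration without inflating task counts.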

Keywords

  • Crowdsourcing
  • Longitudinal data analysis
  • Multi-platform data analysis


Notes

  1. https://www.mturk.com/.

  2. Skyscanner.com is essentially a website that crawls airline websites to index flight information, so that travelers looking for a flight to a given destination can easily compare flights.

  3. L2 was acquired and is currently owned by Gartner.


Author information


Correspondence to Vassilis-Javed Khan.


Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Khan, VJ., Lykourentzou, I., Metaxas, G. (2021). A Six-Month, Multi-platform Investigation of Creative Crowdsourcing. In: Karapanos, E., Gerken, J., Kjeldskov, J., Skov, M.B. (eds) Advances in Longitudinal HCI Research. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-030-67322-2_11

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-67322-2_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-67321-5

  • Online ISBN: 978-3-030-67322-2

  • eBook Packages: Computer Science (R0)