Crowdsourcing User and Design Research

  • Vassilis Javed Khan
  • Gurjot Dhillon
  • Maarten Piso
  • Kimberly Schelle


Crowdsourcing can be defined as the practice of taking a task usually performed by an employee and issuing it as an open call to a crowd of people for completion. Although crowdsourcing has grown in recent years, its application to design research and education has only scratched the surface of its potential. In this chapter we first introduce the different types of crowdsourcing. Then, following the typical design cycle, we present examples from the literature and cases from an educational setting of how crowdsourcing can support designers. Based on these examples we provide a list of tips for utilizing crowdsourcing for design and user research activities.


Keywords (machine-generated): Design Team · Design Cycle · Requirement Elicitation · Amazon Mechanical Turk



We would like to extend our gratitude to all trainees of the User-System Interaction (USI) program of Eindhoven University of Technology (generation 2013) for their insights into using crowdsourcing for design and user research. Moreover, we would like to specially thank the Mindswarms team for their continuous support in helping us use their service for our educational objectives, as well as the teams behind the other aforementioned services.

Annotated Bibliography

  1. Kittur A, Chi EH, Suh B (2008) Crowdsourcing user studies with Mechanical Turk. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM, New York, pp 453–456
     The authors describe an experiment investigating Amazon's Mechanical Turk and how to obtain quality responses from workers.
  2. Mason W, Suri S (2012) Conducting behavioral research on Amazon's Mechanical Turk. Behav Res Methods 44(1):1–23
     The paper demonstrates how to use Amazon's Mechanical Turk for conducting behavioral research and aims to lower the entry barrier for researchers who could benefit from this platform.
  3. McDonald P, Mohebbi M, Slatkin B (2012) Comparing Google Consumer Surveys to existing probability and non-probability based internet surveys. Google Whitepaper
     Describes how Google Consumer Surveys outperforms existing probability-based Internet panels by using what is known as a "surveywall" to attract respondents.
  4. Aitamurto T, Leiponen A, Tee R (2013) The promise of idea crowdsourcing: benefits, contexts, limitations
     The authors review crowdsourcing for idea generation ("idea crowdsourcing") from the perspective of both the academic literature and actual business cases, to understand how and when to use crowdsourcing and with which benefits and costs.
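Several of the works above recruit study participants through Amazon Mechanical Turk. As a minimal sketch of what posting a study task involves, the following hypothetical Python helper (the survey URL, reward, and wording are placeholder assumptions, not from the chapter) assembles the parameter set that MTurk expects for a new HIT (Human Intelligence Task), in the same keyword-argument shape accepted by boto3's `create_hit` call:

```python
# Sketch of the request payload for a crowdsourced user-study task ("HIT")
# on Amazon Mechanical Turk. The helper only builds the payload; actually
# posting it would use a client such as boto3's MTurk `create_hit`, which
# accepts these same keyword arguments.

def build_hit_params(survey_url, reward_usd, max_workers):
    """Assemble a create_hit-style parameter dict for an external survey."""
    # MTurk embeds externally hosted tasks via an ExternalQuestion XML wrapper.
    question_xml = (
        '<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/'
        'AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">'
        f'<ExternalURL>{survey_url}</ExternalURL>'
        '<FrameHeight>600</FrameHeight>'
        '</ExternalQuestion>'
    )
    return {
        "Title": "Short design-research survey (5 minutes)",
        "Description": "Answer questions about a product concept.",
        "Keywords": "survey, user research, design",
        "Reward": f"{reward_usd:.2f}",       # paid per assignment, in USD
        "MaxAssignments": max_workers,       # how many workers may respond
        "AssignmentDurationInSeconds": 600,  # time each worker gets
        "LifetimeInSeconds": 86400,          # how long the HIT stays open
        "Question": question_xml,
    }

params = build_hit_params("https://example.com/survey", 0.50, 30)
print(params["Reward"])          # → 0.50
print(params["MaxAssignments"])  # → 30
```

Fields such as the reward and the number of assignments are exactly the levers the annotated papers discuss when weighing response quality against cost.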



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Vassilis Javed Khan (1)
  • Gurjot Dhillon (1)
  • Maarten Piso (1)
  • Kimberly Schelle (1)

  1. Industrial Design Department, Eindhoven University of Technology, Eindhoven, The Netherlands
