Knowledge based quality analysis of crowdsourced software development platforms

Abstract

As an emerging and promising approach, crowdsourcing-based software development has become popular in many domains, owing to the talented pool of developers who participate in its contests and to the ability it gives requesters (or customers) to choose the 'winning' solution that meets their desired quality levels. However, because such platforms lack a central mechanism for team formation, offer no continuity in a developer's work across consecutive tasks, and carry a risk of noise in contest submissions, there is a gap between the requesters in a domain and their quality concerns about adopting a crowdsourcing-based software development platform. To address these concerns and aid requesters, we describe three measures: Quality of Registrant Developers (QRD), Quality of Contest (QC), and Quality of Support (QS), which compute and predict the quality of a crowdsourcing-based platform from historical information on its completed tasks. We evaluate the capacity of QRD, QC, and QS to predict this quality. We then implement a crawler to mine information on completed development tasks from the TopCoder platform and use it to inspect the proposed measures. The promising results of the QRD, QC, and QS measures suggest that requesters and researchers in other domains, such as pharmaceutical research and development, can use them to investigate and predict the quality of crowdsourcing-based software development platforms.
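The abstract does not give the formulas behind QRD, QC, and QS; the following is a minimal Python sketch of how such measures could, in principle, be aggregated from mined records of completed tasks. Every field name and aggregation rule here (registrants, submissions, passing_submissions, avg_registrant_rating, and the three proxy functions) is a hypothetical illustration under our own assumptions, not the paper's actual definition.

```python
# Hypothetical illustration only: these field names and formulas are NOT taken
# from the paper; they merely show how platform-level quality measures could
# be aggregated from records of completed crowdsourced tasks.
from dataclasses import dataclass
from statistics import mean

@dataclass
class CompletedTask:
    registrants: int              # developers who registered for the contest
    submissions: int              # solutions actually submitted
    passing_submissions: int      # submissions that passed screening/review
    avg_registrant_rating: float  # mean platform rating of the registrants

def quality_of_registrant_developers(tasks):
    """QRD proxy: average rating of the developers the contests attract."""
    return mean(t.avg_registrant_rating for t in tasks)

def quality_of_contest(tasks):
    """QC proxy: share of submissions that survive review (less 'noise')."""
    total_subs = sum(t.submissions for t in tasks)
    total_pass = sum(t.passing_submissions for t in tasks)
    return total_pass / total_subs if total_subs else 0.0

def quality_of_support(tasks):
    """QS proxy: how reliably registration converts into submissions."""
    return mean(t.submissions / t.registrants for t in tasks if t.registrants)

# Toy history, standing in for records a crawler might mine from a platform.
history = [
    CompletedTask(registrants=25, submissions=6, passing_submissions=4,
                  avg_registrant_rating=1450.0),
    CompletedTask(registrants=40, submissions=9, passing_submissions=7,
                  avg_registrant_rating=1620.0),
]
print(quality_of_registrant_developers(history))  # 1535.0
print(quality_of_contest(history))                # 11/15 ≈ 0.733
print(quality_of_support(history))                # mean(0.24, 0.225) = 0.2325
```

Measures computed this way over a platform's task history could then be tracked over time or compared across platforms to support a requester's adoption decision.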

Notes

  1. https://www.topcoder.com/

  2. https://www.topcoder.com/member-onboarding/understanding-your-topcoder-rating/

  3. http://www.mathwave.com/

Author information

Corresponding author

Correspondence to Asad Habib.

About this article

Cite this article

Habib, A., Hussain, S., Khan, A.A. et al. Knowledge based quality analysis of crowdsourced software development platforms. Comput Math Organ Theory 25, 122–131 (2019). https://doi.org/10.1007/s10588-018-9269-5
