
Crowdsourcing, pp. 113–130

An Evolutionary and Automated Virtual Team Making Approach for Crowdsourcing Platforms

  • Tao Yue
  • Shaukat Ali
  • Shuai Wang
Chapter
Part of the Progress in IS book series (PROIS)

Abstract

Crowdsourcing has demonstrated its capability to support various software development activities, including development and testing, as can be seen from several successful crowdsourcing platforms such as TopCoder and uTest. However, crowdsourcing large-scale and complex software development and testing tasks raises several optimization challenges, such as the division of tasks, the search for a suitable set of registrants, and the assignment of tasks to registrants. Since in crowdsourcing a task can be assigned to geographically distributed registrants with diverse backgrounds, the quality of the final task deliverables is a key issue. As a first step toward improving this quality, we propose a systematic and automated approach to optimize the assignment of registrants on a crowdsourcing platform to a crowdsourcing task. The objective is to find the group of registrants that best fits the defined task. Examples of factors forming the optimization problem include the budget defined by the task submitter and the pay expectation of a registrant, the skills required by a task and the skills of a registrant, and the task delivery deadline and the availability of a registrant. We first collected a set of commonly seen factors that affect the matching between a submitted task and a virtual team consisting of a selected set of registrants. We then formulated the optimization objective as a fitness function, i.e., the heuristic used by search algorithms (e.g., Genetic Algorithms) to find an optimal solution. We empirically evaluated a set of search algorithms that are well known in software engineering, together with the proposed fitness function, to identify the best solution to our optimization problem. The results of our experiments are very positive in terms of solving optimization problems in a crowdsourcing context.
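
To illustrate the kind of formulation the abstract describes, the sketch below encodes a fitness function that scores a candidate virtual team against a task's budget, required skills, and deadline-related availability. It is a minimal, hypothetical example: the data structures, the weighting of sub-objectives, and the random-search baseline are assumptions for illustration only, not the authors' actual fitness function or the search algorithms evaluated in the chapter.

```python
# Hypothetical sketch of a registrant-to-task fitness function.
# Not the authors' formulation; all names, weights, and data are assumptions.
from dataclasses import dataclass
import random


@dataclass
class Registrant:
    pay_expectation: float   # cost if this registrant joins the team
    skills: set              # skills this registrant offers
    available_hours: float   # availability before the task deadline


@dataclass
class Task:
    budget: float            # budget defined by the task submitter
    required_skills: set     # skills the task requires
    required_hours: float    # estimated effort before the deadline


def fitness(team, task):
    """Lower is better: 0 means budget, skills, and availability all fit."""
    if not team:
        return 3.0  # worst case: an empty team covers nothing
    cost = sum(r.pay_expectation for r in team)
    over_budget = max(0.0, cost - task.budget) / task.budget
    covered = set().union(*(r.skills for r in team))
    skill_gap = len(task.required_skills - covered) / len(task.required_skills)
    hours = sum(r.available_hours for r in team)
    hour_gap = max(0.0, task.required_hours - hours) / task.required_hours
    # Normalised, equally weighted sub-objectives (a common SBSE convention);
    # over_budget uses the x / (x + 1) normalisation to stay in [0, 1).
    return over_budget / (1 + over_budget) + skill_gap + hour_gap


def random_search(registrants, task, iterations=1000, seed=0):
    """Baseline stand-in for the evaluated search algorithms (e.g., GAs):
    sample random teams and keep the fittest one found."""
    rng = random.Random(seed)
    best, best_fit = None, float("inf")
    for _ in range(iterations):
        team = [r for r in registrants if rng.random() < 0.5]
        f = fitness(team, task)
        if f < best_fit:
            best, best_fit = team, f
    return best, best_fit


if __name__ == "__main__":
    pool = [
        Registrant(300, {"java", "testing"}, 20),
        Registrant(500, {"uml", "modelling"}, 35),
        Registrant(200, {"testing"}, 15),
    ]
    task = Task(budget=900, required_skills={"java", "testing", "uml"},
                required_hours=40)
    team, score = random_search(pool, task)
    print(f"best fitness: {score:.3f}, team size: {len(team)}")
```

In this sketch, random search merely serves as a sanity-check baseline; a Genetic Algorithm or (1+1) Evolutionary Algorithm would optimize the same fitness function by evolving the team membership bit-string instead of sampling it uniformly.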

Keywords

Crowdsourcing · Search algorithms · Empirical studies


Copyright information

© Springer-Verlag Berlin Heidelberg 2015

Authors and Affiliations

  1. Certus Software V&V Center, Simula Research Laboratory, Oslo, Norway
