A Review on the Methods to Evaluate Crowd Contributions in Crowdsourcing Applications
Because crowdsourcing, by its nature, openly accepts contributions from the crowd, these contributions must be evaluated to ensure their reliability. Existing crowdsourcing applications employ a number of methods to evaluate such contributions, and this study aims to identify and document them. To this end, 50 crowdsourcing applications obtained through an extensive literature and online search were reviewed. The analysis found that, depending on the type of crowdsourcing application, whether simple, complex, or creative, three different methods are in use: expert judgement, rating, and feedback. Expert judgement is used mostly in complex and creative crowdsourcing initiatives, while rating is widely used in simple ones. To the best of our knowledge, this paper is the first reference to document the current state of evaluation methods in existing crowdsourcing applications. It should be useful in determining the way forward for research in the area, such as the design of a new evaluation method, and it also justifies the need for an automated evaluation method for crowdsourced contributions.
Keywords: Evaluation method · Crowdsourcing evaluation · Survey · Systematic review · Grounded theory
The information presented in this paper forms part of a research project funded by Universiti Tenaga Nasional, entitled "Formulation of a Trust Building Mechanism for Trustworthy Non-profit Mobile Crowdsourcing Initiatives Using Beta Reputation System" (J510050668).
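The funding note above refers to the Beta Reputation System, which is one way to aggregate the rating and feedback evidence discussed in this review into a single trust score. As background, the sketch below shows the standard reputation expectation from Jøsang and Ismail's Beta Reputation System, namely the mean of a Beta(r + 1, s + 1) distribution over r positive and s negative feedback counts; the function name and the example counts are illustrative, not taken from the paper.

```python
def beta_reputation(r: float, s: float) -> float:
    """Expected reputation of a contributor, given r positive and s negative
    feedback counts, as the mean of a Beta(r + 1, s + 1) distribution.
    With no feedback at all (r = s = 0) the score is a neutral 0.5."""
    return (r + 1) / (r + s + 2)

# Illustrative example: a contributor with 8 positive and 2 negative ratings.
print(beta_reputation(8, 2))  # 0.75
print(beta_reputation(0, 0))  # 0.5 (no evidence yet)
```

Because the score converges toward the observed ratio of positive feedback as evidence accumulates, it gives new contributors a neutral starting point rather than an undeserved extreme score, which is one reason reputation systems of this family suit open crowdsourcing settings.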