Abstract
Crowdsourcing is a technique for obtaining data, ideas, and funds, carrying out tasks, or even solving problems with the aid of a group of people, and it can save both money and time. Because the data is obtained from the crowd, data quality is a central concern for crowdsourcing websites, which must find ways to control it. Some crowdsourcing websites have implemented mechanisms to manage data quality, such as rating, reporting, or dedicated tools. In this paper, five crowdsourcing websites (Wikipedia, Amazon Mechanical Turk, YouTube, Rally Fighter, and Kickstarter) are studied as cases in order to identify quality assurance methods and techniques suitable for crowdsourced data.
A survey was conducted to gather general opinions about the reliability of crowdsourcing sites and about users' willingness to contribute to improving their content. Combining these findings with the available knowledge in crowdsourcing research, the paper highlights the factors that influence data quality in crowdsourcing.
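The rating and redundancy mechanisms mentioned above are commonly implemented as simple label aggregation: the same task is given to several workers and the answers are merged, with the agreement level serving as a quality signal. The sketch below is an illustrative example of majority voting, not a method from the paper; the function name and data layout are assumptions for demonstration.

```python
from collections import Counter

def majority_vote(labels_per_item):
    """Aggregate redundant crowd labels by majority vote.

    labels_per_item: dict mapping item id -> list of labels from workers.
    Returns dict mapping item id -> (winning label, agreement ratio),
    where a low agreement ratio flags items needing review.
    """
    result = {}
    for item, labels in labels_per_item.items():
        counts = Counter(labels)
        label, votes = counts.most_common(1)[0]
        result[item] = (label, votes / len(labels))
    return result

# Three hypothetical workers label two images.
answers = {
    "img1": ["cat", "cat", "dog"],  # 2/3 agreement -> lower confidence
    "img2": ["dog", "dog", "dog"],  # unanimous
}
print(majority_vote(answers))
```

More elaborate schemes weight workers by their historical accuracy (e.g., the EM-based estimation of observer error rates), but the agreement ratio alone already lets a platform route contested items to additional reviewers.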
© 2015 Springer International Publishing Switzerland
Cite this paper
Al Sohibani, M., Al Osaimi, N., Al Ehaidib, R., Al Muhanna, S., Dahanayake, A. (2015). Factors That Influence the Quality of Crowdsourcing. In: Bassiliades, N., et al. New Trends in Database and Information Systems II. Advances in Intelligent Systems and Computing, vol 312. Springer, Cham. https://doi.org/10.1007/978-3-319-10518-5_22
DOI: https://doi.org/10.1007/978-3-319-10518-5_22
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-10517-8
Online ISBN: 978-3-319-10518-5
eBook Packages: Engineering (R0)