Abstract
Linguistic resources can be populated with data through approaches such as crowdsourcing and gamification, provided that motivated contributors are involved. However, current taxonomies of crowdsourcing genres lack the concept of cooperation, which is a principal element of modern video games and may help sustain annotators' interest. This survey of crowdsourcing taxonomies and cooperation in linguistic resources offers recommendations for introducing cooperation into existing crowdsourcing genres, and presents evidence of the effectiveness of cooperation using a popular Russian linguistic resource created through crowdsourcing as an example.
Acknowledgments
This work is supported by the Russian Foundation for the Humanities, project no. 13-04-12020 "New Open Electronic Thesaurus for Russian", and by the Government of the Russian Federation Program 02.A03.21.0006 of 27.08.2013.
The author would like to thank Dmitry Granovsky for the extended statistical information collected from http://opencorpora.org/. The author is also grateful to the anonymous referees who offered very useful comments on the present paper.
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this chapter
Ustalov, D. (2015). Towards Crowdsourcing and Cooperation in Linguistic Resources. In: Braslavski, P., Karpov, N., Worring, M., Volkovich, Y., Ignatov, D.I. (eds) Information Retrieval. RuSSIR 2014. Communications in Computer and Information Science, vol 505. Springer, Cham. https://doi.org/10.1007/978-3-319-25485-2_14
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-25484-5
Online ISBN: 978-3-319-25485-2
eBook Packages: Computer Science (R0)