Abstract
Tasks that require expert knowledge are difficult to crowdsource. They are often too complex for non-experts to carry out, and the experts available in the crowd are difficult to target. Adapting an expert task into a task for non-expert users, thereby enabling the ordinary “crowd” to accomplish it, can be a useful approach. We studied whether a simplified version of an expert annotation task can be carried out by non-expert users. Users performed a game-style annotation task on oil paintings, and the resulting annotations were compared with those from experts. Our results show significant agreement between the annotations of experts and non-experts, that users improve over time, and that aggregating users’ annotations per painting increases their precision.
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this paper
Traub, M.C., van Ossenbruggen, J., He, J., Hardman, L. (2014). Measuring the Effectiveness of Gamesourcing Expert Oil Painting Annotations. In: de Rijke, M., et al. Advances in Information Retrieval. ECIR 2014. Lecture Notes in Computer Science, vol 8416. Springer, Cham. https://doi.org/10.1007/978-3-319-06028-6_10
DOI: https://doi.org/10.1007/978-3-319-06028-6_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-06027-9
Online ISBN: 978-3-319-06028-6
eBook Packages: Computer Science (R0)