
On the Invitation of Expert Contributors from Online Communities for Knowledge Crowdsourcing Tasks

  • Jasper Oosterman
  • Geert-Jan Houben
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9671)

Abstract

The successful execution of knowledge crowdsourcing (KC) tasks requires contributors to possess knowledge or mastery in a specific domain. The need for expert contributors limits the capacity of online crowdsourcing marketplaces to cope with KC tasks. While online social platforms are emerging as a viable alternative source of expert contributors, how to invite them successfully remains an open research question. We contribute an experiment in expert contributor invitation in which we study the performance of two invitation strategies: one addressed to individual expert contributors, and one addressed to communities of knowledge. We target reddit, a popular social bookmarking platform, to seek expert contributors in the botany and ornithology domains of knowledge, and to invite them to contribute to an artwork annotation KC task. Results provide novel insights into the effectiveness of direct invitation strategies, but show that, in the context of our experiment, soliciting collaboration through communities yields more contributions.
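
The abstract does not detail how the invitations were delivered, but both strategies map naturally onto Reddit's public API. The following is a minimal sketch under that assumption, using the third-party PRAW library; the credentials, invitation text, usernames, and subreddits are placeholders rather than those used in the experiment.

    # Minimal sketch of the two invitation strategies, assuming the
    # third-party PRAW library (https://praw.readthedocs.io).
    # All credentials and targets below are placeholders.
    import praw

    reddit = praw.Reddit(
        client_id="CLIENT_ID",          # placeholder
        client_secret="CLIENT_SECRET",  # placeholder
        username="inviter_account",     # placeholder
        password="PASSWORD",            # placeholder
        user_agent="kc-invitation-sketch/0.1",
    )

    INVITATION = (
        "We are annotating a collection of artworks and are looking for "
        "contributors with botanical or ornithological expertise: <task URL>"
    )

    def invite_individual(username: str) -> None:
        # Strategy 1: a direct, private invitation to one expert contributor.
        reddit.redditor(username).message(
            subject="Invitation: artwork annotation task",
            message=INVITATION,
        )

    def invite_community(subreddit_name: str) -> None:
        # Strategy 2: a public call for collaboration, posted as a text
        # submission to a community of knowledge.
        reddit.subreddit(subreddit_name).submit(
            title="Seeking domain experts for an artwork annotation task",
            selftext=INVITATION,
        )

    # Placeholder usage:
    # invite_individual("some_expert")
    # invite_community("botany")

Note that private messages are subject to Reddit's per-account rate limits, which in practice cap the throughput of the direct strategy, whereas a single community post reaches all subscribers of a subreddit.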

Keywords

Target Platform · Annotation Task · Spam Detection · Open Research Question · Invitation Message


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Delft University of Technology, Delft, The Netherlands
