Towards a Research Agenda for Enterprise Crowdsourcing

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6415)

Abstract

Over the past few years, the crowdsourcing paradigm has evolved from its humble beginnings as isolated, purpose-built initiatives, such as Wikipedia, Elance, and Mechanical Turk, into a growth industry employing over two million knowledge workers and contributing over half a billion dollars to the digital economy. Web 2.0 provides the technological foundations upon which the crowdsourcing paradigm evolves and operates, enabling networked experts to work collaboratively to complete a specific task. Enterprise crowdsourcing poses interesting challenges for both academic and industrial research along the social, legal, and technological dimensions.

In this paper we describe the challenges that researchers and practitioners face when thinking about the various aspects of enterprise crowdsourcing. First, to establish the technological foundations, what are the interaction models and protocols between the enterprise and the crowd? Second, how will crowdsourcing address the challenges of quality assurance, enabling enterprises to optimally leverage a scalable workforce? Third, what novel (Web) applications are enabled by enterprise crowdsourcing?

Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Vukovic, M., Bartolini, C. (2010). Towards a Research Agenda for Enterprise Crowdsourcing. In: Margaria, T., Steffen, B. (eds) Leveraging Applications of Formal Methods, Verification, and Validation. ISoLA 2010. Lecture Notes in Computer Science, vol 6415. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-16558-0_36

  • DOI: https://doi.org/10.1007/978-3-642-16558-0_36

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-16557-3

  • Online ISBN: 978-3-642-16558-0
