
Estimating the Ability of Crowd Workers: An Exploratory Experiment Using the Japanese-English Translation Work

  • Conference paper
  • Conference: Collaboration and Technology (CRIWG 2018)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11001)

Abstract

Crowdsourcing (CS) offers the advantage of quick access to workers throughout the world. From the perspective of clients seeking workers, however, it is difficult to estimate a worker's performance before ordering a task in CS. Crowdsourcing service providers (CSPs) publish indices that may be useful in estimating workers' performance; however, the correlation between these indices and workers' actual performance has not been verified.

In this study, several new indices are proposed and their effectiveness is tested in an exploratory experiment using Japanese-English translation work. The experimental results indicate that some of the proposed indices, such as consciousness of contribution to clients, ambition, the degree of difficulty workers perceive in the work, awareness of the reward, and the degree of colloquial tone in writing, correlate with the quality of deliverables. These trends are particularly significant for low performers in terms of deliverable quality. Clients may therefore be able to avoid low performers by using the proposed indices when choosing workers in CS.

Supported by JSPS KAKENHI Grant Number 15K03647.
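The screening idea in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the authors' code: the worker data, the choice of index, the use of Spearman's rank correlation, and the screening cutoff are all assumptions made here for concreteness.

    # Minimal sketch (hypothetical data, not from the paper) of the screening
    # idea: test whether a proposed worker index correlates with deliverable
    # quality, then use it to filter out likely low performers.
    from scipy.stats import spearmanr

    # Hypothetical per-worker values for one proposed index
    # (e.g., "awareness of the reward") and graded translation quality.
    index_scores = [3.5, 2.0, 4.5, 1.5, 4.0, 2.5]
    quality_scores = [4.0, 2.5, 4.5, 1.0, 3.5, 2.0]

    # Rank correlation between the index and deliverable quality.
    rho, p_value = spearmanr(index_scores, quality_scores)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")

    # If the correlation holds, a client could screen out workers whose
    # index falls below a chosen (hypothetical) cutoff before ordering.
    CUTOFF = 2.5
    shortlist = [w for w, s in enumerate(index_scores) if s >= CUTOFF]
    print("Workers passing the screen:", shortlist)

Spearman's rank correlation is used here only because questionnaire-style index scores are ordinal; the abstract does not state which statistical test the authors applied.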



Author information

Corresponding author: Tsutomu Takamiya


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Takamiya, T., Higa, K., Igawa, K. (2018). Estimating the Ability of Crowd Workers: An Exploratory Experiment Using the Japanese-English Translation Work. In: Rodrigues, A., Fonseca, B., Preguiça, N. (eds) Collaboration and Technology. CRIWG 2018. Lecture Notes in Computer Science, vol 11001. Springer, Cham. https://doi.org/10.1007/978-3-319-99504-5_10


  • DOI: https://doi.org/10.1007/978-3-319-99504-5_10


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-99503-8

  • Online ISBN: 978-3-319-99504-5

  • eBook Packages: Computer Science (R0)
