Abstract
Crowdsourcing techniques have been developed over the past decade to leverage the ‘wisdom of crowds’ for solving real-world problems and human intelligence tasks. One such task is fact-checking, which asks workers to judge whether a given claim is true. Fact-checking tasks are used in many applications, such as services that identify fake news. This context motivates us to investigate a crowdsourcing-based framework for finding convincing proofs for fact-checking. Specifically, this study examines crowdsourcing strategies for obtaining convincing, evidence-backed proofs. We focus on an iterative workflow in which workers improve the proofs provided by other crowd workers. We report an experiment comparing two conditions to measure the effect of diversifying the evidence and proofs shown to workers. The results with real-world workers show that presenting diverse evidence and proofs in the iterative workflow helps workers develop more convincing proofs.
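To make the iterative workflow concrete, below is a minimal Python sketch of the two experimental conditions described in the abstract. It is an illustration under assumptions, not the paper's actual method: the names (`select_shown_items`, `iterative_workflow`), the selection rule for "diverse" evidence (prefer items not yet cited by earlier proofs), the number of rounds, and the simulated worker behavior are all hypothetical, since the abstract does not specify them.

```python
import random
from dataclasses import dataclass, field


@dataclass
class Proof:
    """A worker-written proof for a claim, with the evidence items it cites."""
    text: str
    evidence_ids: set = field(default_factory=set)


def select_shown_items(pool, previous_proofs, diversify, k=3):
    """Pick the evidence items to show to the next worker.

    Hypothetical rule: under the 'diverse' condition, prefer evidence not yet
    cited by earlier proofs; under the baseline, sample uniformly from the pool.
    """
    cited = set().union(*(p.evidence_ids for p in previous_proofs))
    if diversify:
        uncited = [e for e in pool if e not in cited]
        # Fall back to the full pool once every item has been cited.
        candidates = uncited or list(pool)
    else:
        candidates = list(pool)
    return random.sample(candidates, min(k, len(candidates)))


def iterative_workflow(claim, evidence_pool, n_rounds=3, diversify=True):
    """One claim's workflow: each round, a (simulated) worker sees the prior
    proofs plus a selection of evidence and produces an improved proof."""
    proofs = []
    for round_no in range(n_rounds):
        shown = select_shown_items(evidence_pool, proofs, diversify)
        # Stand-in for a real crowd task: the 'worker' merges the evidence it
        # was just shown into the latest proof's evidence set.
        merged = set(shown) | (proofs[-1].evidence_ids if proofs else set())
        proofs.append(Proof(text=f"proof v{round_no + 1} for: {claim}",
                            evidence_ids=merged))
    return proofs


if __name__ == "__main__":
    random.seed(0)
    pool = [f"evidence_{i}" for i in range(10)]
    for condition in (True, False):
        final = iterative_workflow("Claim X is true", pool, diversify=condition)[-1]
        label = "diverse" if condition else "baseline"
        print(f"{label}: final proof cites {len(final.evidence_ids)} distinct items")
```

In this toy setup, the diverse condition tends to accumulate a broader evidence set across rounds, which mirrors the intuition the experiment tests: exposing workers to evidence they have not yet seen gives them more material with which to strengthen a proof.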
Acknowledgments
This work was partially supported by JST CREST Grant Number JPMJCR16E3 and JSPS KAKENHI Grant Number 21H03552, Japan.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Wijerathna, N., Matsubara, M., Morishima, A. (2022). Effectiveness of Diverse Evidence for Developing Convincing Proofs with Crowdsourcing. In: Yamamoto, S., Mori, H. (eds) Human Interface and the Management of Information: Visual and Information Design. HCII 2022. Lecture Notes in Computer Science, vol 13305. Springer, Cham. https://doi.org/10.1007/978-3-031-06424-1_14
DOI: https://doi.org/10.1007/978-3-031-06424-1_14
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-06423-4
Online ISBN: 978-3-031-06424-1