
Effectiveness of Diverse Evidence for Developing Convincing Proofs with Crowdsourcing

  • Conference paper
  • First Online:
Human Interface and the Management of Information: Visual and Information Design (HCII 2022)

Abstract

Crowdsourcing techniques have been developed over the past decade to leverage the ‘wisdom of crowds’ for real-world problems and human intelligence tasks. One such task is fact-checking, which asks workers to judge whether a given claim is true. Fact-checking tasks underpin many applications, such as services that identify fake news. This context motivates us to investigate a crowdsourcing-based framework for finding convincing proofs for fact-checking. This study investigates crowdsourcing strategies for obtaining convincing, evidence-backed proofs. We focus on an iterative workflow in which workers improve the proofs written by other crowd workers. We report an experiment comparing two conditions to measure the effect of diversifying the evidence and proofs shown to workers. The results with real-world workers show that presenting diverse evidence and proofs in the iterative workflow helps workers develop more convincing proofs.
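The abstract describes an iterative workflow in which each worker improves the previous worker's proof, with evidence shown under one of two conditions (diversified or fixed). The sketch below is a hypothetical illustration of such a loop, not the paper's actual task design: the names (`Proof`, `select_evidence`, `iterate_proof`) and the evidence-selection heuristic are invented for exposition, assuming a simple "prefer unseen sources" rule for the diverse condition.

```python
# Hypothetical sketch of an iterative proof-improvement workflow.
# Names and selection logic are illustrative assumptions, not the
# paper's implementation.
from dataclasses import dataclass, field

@dataclass
class Proof:
    claim: str
    text: str = ""
    evidence: list = field(default_factory=list)  # evidence shown last round
    history: list = field(default_factory=list)   # proof snapshots per round

def select_evidence(pool, shown, diversify, k=3):
    """Pick k evidence items for the next worker.

    diversify=True  -> prefer items no earlier worker has seen,
                       approximating the 'diverse evidence' condition.
    diversify=False -> always show the same top-of-pool items.
    """
    if diversify:
        fresh = [e for e in pool if e not in shown]
        picked = (fresh + [e for e in pool if e in shown])[:k]
    else:
        picked = pool[:k]
    shown.update(picked)
    return picked

def iterate_proof(proof, pool, improve, rounds=3, diversify=True):
    """Run the improve-the-previous-proof loop with a worker function.

    `improve(text, evidence)` stands in for a crowd worker who revises
    the current proof text using the evidence shown to them.
    """
    shown = set()
    for _ in range(rounds):
        evidence = select_evidence(pool, shown, diversify)
        proof.history.append(proof.text)
        proof.text = improve(proof.text, evidence)
        proof.evidence = evidence
    return proof
```

Under the diverse condition, later rounds surface evidence items that earlier workers never saw, while the fixed condition re-shows the same items each round; this is the contrast between the two experimental conditions that the abstract compares.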


Notes

  1. https://datareportal.com/reports/digital-2021-global-overview-report
  2. https://phys.org/news/2019-06-percent-internet-users-duped-fake.html
  3. https://www.mturk.com/
  4. https://www.katsuyoshinakahara.com/spring/
  5. http://travel.nationalgeographic.com/photographer-of-the-year-2016/gallery/week-6-nature/19/
  6. https://www.nationalgeographic.com/photography/photo-of-the-day/2015/6/mountain-view-poppies/
  7. https://snapshot.canon-asia.com/article/en/stunning-summer-landscapes-scenic-spots-in-japan-pro-photography-tips-1
  8. http://travel.nationalgeographic.com/photographer-of-the-year-2016/gallery/week-6-nature/19/
  9. https://en.japantravel.com/ibaraki/hiking-up-mt-tsukuba/2653


Acknowledgments

This work was partially supported by JST CREST Grant Number JPMJCR16E3, JSPS KAKENHI Grant Number 21H03552, Japan.

Author information

Correspondence to Masaki Matsubara or Atsuyuki Morishima.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Wijerathna, N., Matsubara, M., Morishima, A. (2022). Effectiveness of Diverse Evidence for Developing Convincing Proofs with Crowdsourcing. In: Yamamoto, S., Mori, H. (eds) Human Interface and the Management of Information: Visual and Information Design. HCII 2022. Lecture Notes in Computer Science, vol 13305. Springer, Cham. https://doi.org/10.1007/978-3-031-06424-1_14


  • DOI: https://doi.org/10.1007/978-3-031-06424-1_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-06423-4

  • Online ISBN: 978-3-031-06424-1

