An Annotation Protocol for Collecting User-Generated Counter-Arguments Using Crowdsourcing

  • Conference paper

Artificial Intelligence in Education (AIED 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11626)
Abstract

Constructive feedback is important for improving critical thinking skills. However, little work has been done to automatically generate such feedback for an argument. In this work, we experiment with an annotation protocol for collecting user-generated counter-arguments via crowdsourcing. We conduct two parallel crowdsourcing experiments, where workers are instructed to produce (i) a counter-argument, and (ii) a counter-argument after identifying a fallacy. Our analysis indicates that we can collect counter-arguments that are useful as constructive feedback, especially when workers are first asked to identify a fallacy type.


Notes

  1. https://www.grammarly.com/.

  2. https://www.ets.org/erater.

  3. http://www.figure-eight.com.

  4. http://www.softschools.com/examples/fallacies/.

  5. We calculate Cohen's kappa after filtering out unsure instances.
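The agreement measure in note 5 can be sketched as follows. This is a minimal illustration, not the authors' code: the label names (`"fallacy"`, `"ok"`, `"unsure"`) and the example annotations are hypothetical, and the filtering rule (drop any instance either annotator marked unsure) is an assumption consistent with the note.

```python
# Sketch of note 5: filter out "unsure" instances, then compute
# Cohen's kappa between two annotators. Labels/data are hypothetical.
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two annotators' label sequences."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    # Observed agreement: fraction of instances with identical labels.
    p_o = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement from each annotator's marginal label distribution.
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[lab] * cb[lab] for lab in set(a) | set(b)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical annotations; drop instances either annotator marked unsure.
ann1 = ["fallacy", "unsure", "fallacy", "ok", "ok"]
ann2 = ["fallacy", "ok",     "fallacy", "ok", "fallacy"]
kept = [(x, y) for x, y in zip(ann1, ann2) if "unsure" not in (x, y)]
a, b = zip(*kept)
print(round(cohens_kappa(a, b), 3))  # → 0.5
```

Filtering before computing kappa matters: unsure instances would otherwise count as disagreements and deflate the chance-corrected score.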


Author information

Corresponding author

Correspondence to Paul Reisert.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Reisert, P., Vallejo, G., Inoue, N., Gurevych, I., Inui, K. (2019). An Annotation Protocol for Collecting User-Generated Counter-Arguments Using Crowdsourcing. In: Isotani, S., Millán, E., Ogan, A., Hastings, P., McLaren, B., Luckin, R. (eds) Artificial Intelligence in Education. AIED 2019. Lecture Notes in Computer Science (LNAI), vol. 11626. Springer, Cham. https://doi.org/10.1007/978-3-030-23207-8_43

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-23207-8_43

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-23206-1

  • Online ISBN: 978-3-030-23207-8

  • eBook Packages: Computer Science, Computer Science (R0)
