
Inductive Model Using Abstract Meaning Representation for Text Classification via Graph Neural Networks

  • Conference paper
  • Published in: Human Interface and the Management of Information (HCII 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14015)


Abstract

Text classification is a fundamental task in NLP. Recently, graph neural networks (GNNs) have been applied to this field. GNN-based methods capture text structure effectively and improve text classification performance. However, previous work could not accurately capture non-consecutive and long-distance semantics within individual documents. To address this issue, we propose an ensemble model with two components: one for capturing non-consecutive and long-distance semantics and another for capturing local word-sequence semantics. Specifically, we use abstract meaning representation (AMR) graphs to model the relations between entities, and a second set of graphs, built with a fixed-size sliding window, to capture local word-sequence semantics. Furthermore, we propose a learning method that considers the edge features of AMR graphs. Extensive experiments on benchmark datasets demonstrate the effectiveness of our proposed methods and of AMR graphs.
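The fixed-size sliding-window graph construction mentioned in the abstract can be sketched as follows. This is a minimal illustration, assuming co-occurrence counts as edge weights and a three-token window; the paper's exact construction (and its AMR counterpart) may differ, and the function name `sliding_window_graph` is hypothetical.

```python
from collections import defaultdict

def sliding_window_graph(tokens, window=3):
    """Build a per-document word graph: nodes are unique tokens, and an
    undirected edge links two tokens whenever they co-occur inside a
    fixed-size sliding window, weighted by the co-occurrence count."""
    nodes = sorted(set(tokens))
    index = {w: i for i, w in enumerate(nodes)}
    edges = defaultdict(int)
    for start in range(len(tokens)):
        # Pair the token at `start` with the next (window - 1) tokens.
        for other in tokens[start + 1:start + window]:
            a, b = index[tokens[start]], index[other]
            if a != b:  # skip self-loops from repeated words
                edges[(min(a, b), max(a, b))] += 1
    return nodes, dict(edges)

nodes, edges = sliding_window_graph("the cat sat on the mat".split(), window=3)
```

In a GNN pipeline, `nodes` would be mapped to word embeddings and `edges` to a weighted adjacency matrix for message passing; here "the" and "sat" receive weight 2 because they co-occur in two windows.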

Notes

  1. http://disi.unitn.it/moschitti/corpora.htm

  2. The code has not been released. Therefore, we only cite the accuracy on the MR dataset as reported by Xie et al. [11].

References

  1. Wang, S., Manning, C.D.: Baselines and bigrams: simple, good sentiment and topic classification. In: Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short papers), pp. 90–94 (2012)

  2. Wang, Y., Huang, M., Zhu, X., Zhao, L.: Attention-based LSTM for aspect-level sentiment classification. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 606–615 (2016)

  3. Dhingra, B., Pruthi, D., Rajagopal, D.: Simple and effective semi-supervised question answering. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short papers), pp. 582–587 (2018)

  4. Joachims, T.: Text categorization with support vector machines: learning with many relevant features. In: Nédellec, C., Rouveirol, C. (eds.) ECML 1998. LNCS, vol. 1398, pp. 137–142. Springer, Heidelberg (1998). https://doi.org/10.1007/BFb0026683

  5. Cavnar, W.B., Trenkle, J.M.: N-gram-based text categorization. In: Proceedings of SDAIR-94, 3rd Annual Symposium on Document Analysis and Information Retrieval (1994)

  6. Kim, Y.: Convolutional neural networks for sentence classification. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, pp. 1746–1751 (2014)

  7. Zhou, C., Sun, C., Liu, Z., Lau, F.: A C-LSTM neural network for text classification. arXiv preprint arXiv:1511.08630 (2015)

  8. Yang, Z., Yang, D., Dyer, C., He, X., Smola, A., Hovy, E.: Hierarchical attention networks for document classification. In: Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 1480–1489 (2016)

  9. Peng, H., et al.: Large-scale hierarchical text classification with recursively regularized deep graph-CNN. In: Proceedings of the 2018 World Wide Web Conference, pp. 1063–1072 (2018)

  10. Yao, L., Mao, C., Luo, Y.: Graph convolutional networks for text classification. In: Proceedings of the 33rd AAAI Conference on Artificial Intelligence, pp. 7370–7377 (2019)

  11. Xie, Q., Huang, J.: Inductive topic variational graph auto-encoder for text classification. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 4218–4227 (2021)

  12. Kipf, T.N., Welling, M.: Semi-supervised classification with graph convolutional networks. In: Proceedings of the 2017 International Conference on Learning Representations (2017)

  13. Huang, L., Ma, D., Li, S., Zhang, X., Wang, H.: Text level graph neural network for text classification. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, pp. 3444–3450 (2019)

  14. Rousseau, F., Kiagias, E., Vazirgiannis, M.: Text categorization as a graph classification problem. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pp. 1702–1712 (2015)

  15. Zhang, Y., Yu, X., Cui, Z., Wu, S., Wen, Z., Wang, L.: Every document owns its structure: inductive text classification via graph neural networks. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 334–339 (2020)

  16. Banarescu, L., et al.: Abstract meaning representation for SemBanking. In: Proceedings of the 7th Linguistic Annotation Workshop and Interoperability with Discourse, pp. 178–186 (2013)

  17. Zhang, Y., et al.: Lightweight, dynamic graph convolutional networks for AMR-to-text generation. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 2162–2172 (2020)

  18. Wang, T., Wan, X., Yao, S.: Better AMR-to-text generation with graph structure reconstruction. In: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, pp. 3919–3925 (2021)

  19. Bevilacqua, M., Blloshmi, R., Navigli, R.: One SPRING to rule them both: symmetric AMR semantic parsing and generation without a complex pipeline. In: Proceedings of the 35th AAAI Conference on Artificial Intelligence, pp. 12564–12573 (2021)

  20. Lam, H.T., et al.: Ensembling graph predictions for AMR parsing. In: Proceedings of the 35th Conference on Neural Information Processing Systems (2021)

  21. Song, L., Gildea, D., Zhang, Y., Wang, Z., Su, J.: Semantic neural machine translation using AMR. Trans. Assoc. Comput. Linguist. 7, 19–31 (2019)

  22. Hardy, H., Vlachos, A.: Guided neural language generation for abstractive summarization using abstract meaning representation. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pp. 768–773 (2018)

  23. Liao, K., Lebanoff, L., Liu, F.: Abstract meaning representation for multi-document summarization. In: Proceedings of the 27th International Conference on Computational Linguistics, pp. 1178–1190 (2018)

  24. Rao, S., Marcu, D., Knight, K., Daumé III, H.: Biomedical event extraction using abstract meaning representation. In: Proceedings of the BioNLP 2017 Workshop, pp. 126–135 (2017)

  25. Zhang, Z., Ji, H.: Abstract meaning representation guided graph encoding and decoding for joint information extraction. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 39–49 (2021)

  26. Mitra, A., Baral, C.: Addressing a question answering challenge by combining statistical methods with inductive rule learning and reasoning. In: Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence, pp. 2779–2785 (2016)

  27. Li, Y., Tarlow, D., Brockschmidt, M., Zemel, R.: Gated graph sequence neural networks. In: Proceedings of the 2016 International Conference on Learning Representations (2016)

  28. Beck, D., Haffari, G., Cohn, T.: Graph-to-sequence learning using gated graph neural networks. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Long Papers), pp. 273–283 (2018)

  29. Liu, P., Qiu, X., Huang, X.: Recurrent neural network for text classification with multi-task learning. In: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence, pp. 2873–2879 (2016)

  30. Liu, X., You, X., Zhang, X., Wu, J., Lv, P.: Tensor graph convolutional networks for text classification. In: Proceedings of the 34th AAAI Conference on Artificial Intelligence, pp. 8409–8416 (2020)

  31. Ding, K., Wang, J., Li, J., Li, D., Liu, H.: Be more with less: hypergraph attention networks for inductive text classification. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 4927–4936 (2020)

  32. Graves, A., Schmidhuber, J.: Framewise phoneme classification with bidirectional LSTM and other neural network architectures. Neural Netw. 18(5–6), 602–610 (2005)

  33. Cai, S., Li, L., Han, X., Zha, Z.-J., Huang, Q.: Edge-featured graph neural architecture search. arXiv preprint arXiv:2109.01356 (2021)

  34. Pang, B., Lee, L.: Seeing stars: exploiting class relationships for sentiment categorization with respect to rating scales. In: Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics, pp. 115–124 (2005)

  35. Socher, R., et al.: Recursive deep models for semantic compositionality over a sentiment treebank. In: Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, pp. 1631–1642 (2013)

  36. Li, X., Roth, D.: Learning question classifiers. In: Proceedings of the 19th International Conference on Computational Linguistics, (Volume 1), pp. 1–7 (2002)

  37. Pang, B., Lee, L.: A sentimental education: sentiment analysis using subjectivity summarization based on minimum cuts. In: Proceedings of the 42nd Annual Meeting on Association for Computational Linguistics, pp. 271–278 (2004)

  38. Papineni, K., Roukos, S., Ward, T., Zhu, W.J.: BLEU: a method for automatic evaluation of machine translation. In: Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics, pp. 311–318 (2002)

  39. Pennington, J., Socher, R., Manning, C.: GloVe: global vectors for word representation. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 1532–1543 (2014)

  40. Kingma, D.P., Ba, J.: Adam: a method for stochastic optimization. In: Proceedings of the 3rd International Conference on Learning Representations (2015)

Author information

Correspondence to Ryosuke Saga.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Ogawa, T., Saga, R. (2023). Inductive Model Using Abstract Meaning Representation for Text Classification via Graph Neural Networks. In: Mori, H., Asahi, Y. (eds) Human Interface and the Management of Information. HCII 2023. Lecture Notes in Computer Science, vol 14015. Springer, Cham. https://doi.org/10.1007/978-3-031-35132-7_19

  • Print ISBN: 978-3-031-35131-0

  • Online ISBN: 978-3-031-35132-7
