Abstract
Joint entity and relation extraction establishes a bond between the two tasks and surpasses sequential, pipeline-based extraction. Many joint methods learn a unified representation for both tasks to exploit the correlations between Named Entity Recognition (NER) and Relation Extraction (RE). However, they suffer from feature confusion: features extracted for one task may conflict with those for the other. To address this issue, we propose a novel Dual Interactive Attention Network that learns independent representations while guaranteeing bidirectional, fine-grained interaction between NER and RE. Specifically, we propose a Fine-grained Attention Cross-Unit to model interaction at the token level, which fully exploits the correlation between entities and relations. To obtain task-specific representations, we introduce a novel attention mechanism that captures the correlations among multiple sequences from a specific task and performs better than the traditional self-attention network. We conduct extensive experiments on five standard benchmarks (ACE04, ACE05, ADE, CoNLL04, SciERC) and achieve state-of-the-art performance, demonstrating the effectiveness of our approach to joint entity and relation extraction.
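The abstract describes token-level, bidirectional interaction between two task-specific representations. The paper's exact formulation is not given here, so the following is only a minimal, generic sketch of scaled dot-product cross-attention applied in both directions between hypothetical NER and RE token features; all function names and the residual design are illustrative assumptions, not the authors' implementation.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cross_attend(queries, keys_values):
    # For each query token, mix in information from the other task's
    # token representations via scaled dot-product attention.
    d = len(queries[0])
    out = []
    for q in queries:
        scores = softmax([dot(q, k) / math.sqrt(d) for k in keys_values])
        mixed = [sum(w * kv[i] for w, kv in zip(scores, keys_values))
                 for i in range(d)]
        # A residual connection keeps the task-specific signal intact
        # (an assumption of this sketch, not taken from the paper).
        out.append([qi + mi for qi, mi in zip(q, mixed)])
    return out

def bidirectional_cross_unit(ner_repr, re_repr):
    # Token-level exchange in both directions: NER tokens attend to RE
    # features and vice versa, so the two tasks interact without being
    # collapsed into one shared representation.
    return cross_attend(ner_repr, re_repr), cross_attend(re_repr, ner_repr)

# Toy 3-token sentence with 2-dimensional task features.
ner = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
re = [[0.2, 0.8], [0.9, 0.1], [0.4, 0.6]]
new_ner, new_re = bidirectional_cross_unit(ner, re)
print(len(new_ner), len(new_ner[0]))  # 3 2
```

The point of keeping two separate outputs (rather than one merged tensor) mirrors the abstract's motivation: each task retains its own representation, and interaction happens through attention rather than through a shared feature space that could cause feature confusion.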
Acknowledgments
This work is supported by grants from the National Natural Science Foundation of China (No. 62076048) and the Science and Technology Innovation Foundation of Dalian (2020JJ26GX035).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Li, L., Wang, Z., Qin, X., Lu, H. (2022). Dual Interactive Attention Network for Joint Entity and Relation Extraction. In: Lu, W., Huang, S., Hong, Y., Zhou, X. (eds) Natural Language Processing and Chinese Computing. NLPCC 2022. Lecture Notes in Computer Science(), vol 13551. Springer, Cham. https://doi.org/10.1007/978-3-031-17120-8_21
DOI: https://doi.org/10.1007/978-3-031-17120-8_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-17119-2
Online ISBN: 978-3-031-17120-8