Dual Interactive Attention Network for Joint Entity and Relation Extraction

  • Conference paper
  • First Online:
Natural Language Processing and Chinese Computing (NLPCC 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13551)

Abstract

The joint entity and relation extraction method establishes a bond between the two tasks and surpasses sequential extraction based on the pipeline method. Many joint approaches learn a unified representation for both tasks to exploit the correlations between Named Entity Recognition (NER) and Relation Extraction (RE). However, they suffer from feature confusion: features extracted for one task may conflict with those for the other. To address this issue, we propose a novel Dual Interactive Attention Network that learns independent representations for the two tasks while guaranteeing bidirectional and fine-grained interaction between NER and RE. Specifically, we propose a Fine-grained Attention Cross-Unit to model the interaction at the token level, which fully exploits the correlation between entities and relations. To obtain task-specific representations, we introduce a novel attention mechanism that captures the correlations among multiple sequences within a task and performs better than the traditional self-attention network. We conduct extensive experiments on five standard benchmarks (ACE04, ACE05, ADE, CoNLL04, SciERC) and achieve state-of-the-art performance, demonstrating the effectiveness of our approach to joint entity and relation extraction.
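
The abstract describes the Fine-grained Attention Cross-Unit only at a high level. The PyTorch sketch below illustrates one plausible reading of token-level, bidirectional interaction between task-specific NER and RE representations; the class name FineGrainedAttentionCrossUnit, the sigmoid-gated mixing, and the tensor shapes are assumptions made for illustration, not the paper's actual design.

```python
# Hypothetical sketch of token-level cross-task interaction; not the
# authors' implementation. Each token decides, via a learned gate, how
# much of the other task's representation to absorb, so the NER and RE
# streams stay independent while still interacting bidirectionally.
import torch
import torch.nn as nn


class FineGrainedAttentionCrossUnit(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        # One scalar gate per token and per direction (RE->NER and NER->RE).
        self.gate_ner = nn.Linear(2 * hidden_size, 1)
        self.gate_re = nn.Linear(2 * hidden_size, 1)

    def forward(self, h_ner: torch.Tensor, h_re: torch.Tensor):
        # h_ner, h_re: (batch, seq_len, hidden_size) task-specific token states.
        pair = torch.cat([h_ner, h_re], dim=-1)
        a_ner = torch.sigmoid(self.gate_ner(pair))  # weight on RE features for the NER stream
        a_re = torch.sigmoid(self.gate_re(pair))    # weight on NER features for the RE stream
        new_ner = h_ner + a_ner * h_re              # NER stream absorbs relation cues
        new_re = h_re + a_re * h_ner                # RE stream absorbs entity cues
        return new_ner, new_re


if __name__ == "__main__":
    unit = FineGrainedAttentionCrossUnit(hidden_size=768)
    ner_states = torch.randn(2, 16, 768)   # e.g. BERT-sized hidden states
    re_states = torch.randn(2, 16, 768)
    ner_out, re_out = unit(ner_states, re_states)
    print(ner_out.shape, re_out.shape)     # torch.Size([2, 16, 768]) twice
```

A per-token gate is simply the most compact way to keep the exchange fine-grained; the paper's attention-based cross-unit is presumably richer, but it serves the same purpose of letting each stream selectively absorb the other task's features.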

References

  1. Bekoulis, G., Deleu, J., Demeester, T., Develder, C.: Joint entity recognition and relation extraction as a multi-head selection problem. Expert Syst. Appl. 114, 34–45 (2018)

  2. Cabot, P.L.H., Navigli, R.: REBEL: relation extraction by end-to-end language generation. In: Findings of the EMNLP 2021, pp. 2370–2381 (2021)

  3. Walker, C., Strassel, S., Medero, J., Maeda, K.: ACE 2005 multilingual training corpus. Linguistic Data Consortium, Philadelphia (2006)

  4. Doddington, G.R., Mitchell, A., Przybocki, M.A., Ramshaw, L.A., Strassel, S.M., Weischedel, R.M.: The automatic content extraction (ACE) program-tasks, data, and evaluation. In: LREC, vol. 2, pp. 837–840 (2004)

  5. Eberts, M., Ulges, A.: Span-based joint entity and relation extraction with transformer pre-training. In: ECAI 2020, pp. 2006–2013 (2020)

  6. Guo, M.H., Liu, Z.N., Mu, T.J., Hu, S.M.: Beyond self-attention: external attention using two linear layers for visual tasks. arXiv preprint arXiv:2105.02358 (2021)

  7. Gurulingappa, H., Rajput, A.M., Roberts, A., Fluck, J., Hofmann-Apitius, M., Toldo, L.: Development of a benchmark corpus to support the automatic extraction of drug-related adverse effects from medical case reports. J. Biomed. Inf. 45(5), 885–892 (2012)

  8. Luan, Y., He, L., Ostendorf, M., Hajishirzi, H.: Multi-task identification of entities, relations, and coreference for scientific knowledge graph construction. In: Proceedings of the EMNLP, pp. 3219–3232 (2018)

  9. Miwa, M., Bansal, M.: End-to-end relation extraction using LSTMs on sequences and tree structures. In: Proceedings of the ACL, pp. 1105–1116 (2016)

  10. Ren, F., et al.: A novel global feature-oriented relational triple extraction model based on table filling. In: Proceedings of the EMNLP, pp. 2646–2656 (2021)

  11. Roth, D., Yih, W.: A linear programming formulation for global inference in natural language tasks. In: Proceedings of the HLT-NAACL, pp. 1–8 (2004)

  12. Vaswani, A., et al.: Attention is all you need. In: Proceedings of the NIPS, pp. 5998–6008 (2017)

  13. Wang, J., Lu, W.: Two are better than one: joint entity and relation extraction with table-sequence encoders. In: Proceedings of the EMNLP, pp. 1706–1721 (2020)

  14. Wang, Y., Sun, C., Wu, Y., Zhou, H., Li, L., Yan, J.: UniRE: a unified label space for entity relation extraction. In: Proceedings of the ACL, pp. 220–231 (2021)

  15. Wang, Y., Yu, B., Zhang, Y., Liu, T., Zhu, H., Sun, L.: TPLinker: single-stage joint extraction of entities and relations through token pair linking. In: Proceedings of the COLING, pp. 1572–1582 (2020)

  16. Wei, Z., Su, J., Wang, Y., Tian, Y.: A novel cascade binary tagging framework for relational triple extraction. In: Proceedings of the ACL, pp. 1476–1488 (2020)

  17. Yan, Z., Zhang, C., Fu, J., Zhang, Q., Wei, Z.: A partition filter network for joint entity and relation extraction. In: Proceedings of the EMNLP, pp. 185–197 (2021)

  18. Zheng, S., Wang, F., Bao, H., Hao, Y., Zhou, P., Xu, B.: Joint extraction of entities and relations based on a novel tagging scheme. In: Proceedings of the ACL, pp. 1227–1236 (2017)

  19. Zhong, Z., Chen, D.: A frustratingly easy approach for entity and relation extraction. In: Proceedings of the NAACL, pp. 50–61 (2021)

Acknowledgments

This work is supported by grants from the National Natural Science Foundation of China (No. 62076048) and the Science and Technology Innovation Foundation of Dalian (2020JJ26GX035).

Author information

Corresponding authors

Correspondence to Lishuang Li or Zehao Wang.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Li, L., Wang, Z., Qin, X., Lu, H. (2022). Dual Interactive Attention Network for Joint Entity and Relation Extraction. In: Lu, W., Huang, S., Hong, Y., Zhou, X. (eds) Natural Language Processing and Chinese Computing. NLPCC 2022. Lecture Notes in Computer Science (LNAI), vol 13551. Springer, Cham. https://doi.org/10.1007/978-3-031-17120-8_21

  • DOI: https://doi.org/10.1007/978-3-031-17120-8_21

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-17119-2

  • Online ISBN: 978-3-031-17120-8

  • eBook Packages: Computer Science, Computer Science (R0)
