
Joint Extraction of Entities and Relations: An Advanced BERT-based Decomposition Method

  • Chapter
MDATA: A New Knowledge Representation Model

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12647)


Abstract

Joint extraction of entities and relations is an important task in Natural Language Processing (NLP) and the basis of many higher-level NLP tasks. However, most existing joint models handle overlapping triples poorly. In this chapter, we propose an efficient end-to-end model for the joint extraction of entities and overlapping relations. First, the BERT pre-trained model is introduced to produce finer-grained representations of the text. Next, we decompose triple extraction into two subtasks, head entity extraction and tail entity extraction, which resolves the single-entity-overlap problem in triples. Then, we divide tail entity extraction into three parallel extraction sub-processes to address the entity-pair-overlap problem, i.e., the relation-overlap problem. Finally, we cast each extraction sub-process as a sequence-tagging task. We evaluate our model on the New York Times (NYT) dataset and achieve strong results (Precision = 0.870, Recall = 0.851, F1 = 0.860) compared with most current models. The experimental results show that our model is effective at handling the triple-overlap problem.
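The decomposition described in the abstract can be illustrated with a small labeling routine. This is a sketch only: the function name, the inclusive-span convention, and the BIO tag set are assumptions for illustration, not the authors' implementation. The key idea it demonstrates is that head extraction gets one tag sequence while each (head, relation) pair gets its own tail tag sequence, so triples that share a head entity (single-entity overlap) or a full entity pair (entity-pair overlap) no longer collide in a single sequence.

```python
def build_labels(tokens, triples):
    """Build sequence-tagging labels for the two subtasks.

    triples: list of (head, relation, tail), where head and tail are
    inclusive (start, end) token spans. Returns one BIO sequence for
    head entity extraction and one BIO sequence per (head, relation)
    pair for tail entity extraction.
    """
    def bio(span, n):
        # Tag a single span over n tokens with B/I, everything else O.
        tags = ["O"] * n
        start, end = span
        tags[start] = "B"
        for i in range(start + 1, end + 1):
            tags[i] = "I"
        return tags

    n = len(tokens)

    # Subtask 1: one shared tag sequence marks every head entity.
    head_tags = ["O"] * n
    for head, _, _ in triples:
        for i, tag in enumerate(bio(head, n)):
            if tag != "O":
                head_tags[i] = tag

    # Subtask 2: a separate tag sequence per (head, relation) pair,
    # so overlapping triples never share one sequence.
    tail_tags = {}
    for head, rel, tail in triples:
        seq = tail_tags.setdefault((head, rel), ["O"] * n)
        for i, tag in enumerate(bio(tail, n)):
            if tag != "O":
                seq[i] = tag

    return head_tags, tail_tags


tokens = ["Obama", "was", "born", "in", "Hawaii", "and", "lived", "in", "Chicago"]
# Two triples sharing the head entity "Obama": single-entity overlap.
triples = [((0, 0), "born_in", (4, 4)), ((0, 0), "lived_in", (8, 8))]
head_tags, tail_tags = build_labels(tokens, triples)
```

In this example the shared head "Obama" is tagged once for the head-extraction subtask, while each relation's tail ("Hawaii", "Chicago") lands in its own tag sequence, which is how the decomposition sidesteps the overlap conflict.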



Author information

Correspondence to Aiping Li.



Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Wang, C., Li, A. (2021). Joint Extraction of Entities and Relations: An Advanced BERT-based Decomposition Method. In: Jia, Y., Gu, Z., Li, A. (eds.) MDATA: A New Knowledge Representation Model. Lecture Notes in Computer Science, vol. 12647. Springer, Cham. https://doi.org/10.1007/978-3-030-71590-8_5


  • DOI: https://doi.org/10.1007/978-3-030-71590-8_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-71589-2

  • Online ISBN: 978-3-030-71590-8

  • eBook Packages: Computer Science (R0)
