Abstract
Relation extraction is the task of automatically extracting relational triples from large amounts of unstructured text, and it is an important subtask in building knowledge graphs (KGs). However, traditional triple-extraction methods face many challenges, especially the overlapping-triple problem. In this paper, we propose GPPR, a relation extraction model based on a global pointer network and potential relation embeddings. Unlike common approaches, we decode entities and relations with global pointers, and we fuse potential relation embeddings with the input text features when decoding relations. The global pointer network allows GPPR to fully resolve overlapping-triple cases, and the potential relation embeddings reduce the network's computation. Experiments demonstrate that our proposed model outperforms state-of-the-art (SOTA) baselines on public benchmarks in both efficiency and performance.
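The global pointer mechanism the abstract refers to scores every candidate (start, end) token pair jointly, rather than tagging start and end positions independently, which is what lets it handle overlapping spans. The sketch below is a hedged illustration of that core scoring step in NumPy, not the authors' implementation; the projection matrices, dimensions, and masking convention are our assumptions.

```python
import numpy as np

def global_pointer_scores(h, Wq, Wk):
    """Score every (start, end) token pair for one entity/relation type.

    h  : (seq_len, d)  token embeddings from the encoder
    Wq : (d, d_head)   projection producing "start" queries
    Wk : (d, d_head)   projection producing "end" keys
    Returns a (seq_len, seq_len) matrix where entry [i, j] scores
    the span that starts at token i and ends at token j.
    """
    q = h @ Wq                      # (seq_len, d_head)
    k = h @ Wk                      # (seq_len, d_head)
    scores = q @ k.T                # bilinear score for each token pair
    # Mask spans whose end precedes their start (lower triangle).
    valid = np.triu(np.ones_like(scores, dtype=bool))
    return np.where(valid, scores, -np.inf)

rng = np.random.default_rng(0)
seq_len, d, d_head = 6, 8, 4
scores = global_pointer_scores(rng.normal(size=(seq_len, d)),
                               rng.normal(size=(d, d_head)),
                               rng.normal(size=(d, d_head)))
print(scores.shape)  # one score matrix per type; spans above a threshold are predicted
```

In a full model there would be one such score matrix per entity type and per relation, with spans (or span pairs) above a decision threshold decoded as predictions.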
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Yang, H., Qiao, Y. (2023). Relation Extraction Model Based on Global Pointer and Potential Relation Embedding. In: Kondo, K., Horng, MF., Pan, JS., Hu, P. (eds) Advances in Intelligent Information Hiding and Multimedia Signal Processing. IIHMSP 2022. Smart Innovation, Systems and Technologies, vol 339. Springer, Singapore. https://doi.org/10.1007/978-981-99-0105-0_19
DOI: https://doi.org/10.1007/978-981-99-0105-0_19
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-0104-3
Online ISBN: 978-981-99-0105-0
eBook Packages: Intelligent Technologies and Robotics (R0)