Abstract
Relation extraction is an important task in knowledge graph completion, information extraction, and information retrieval. Recent neural models (especially those with attention mechanisms) have been shown to perform reasonably well. However, they sometimes fail to: (i) understand the semantic similarity between words and the given entities; and (ii) capture long-distance dependencies among words and entities, such as co-reference. Therefore, this paper proposes a novel relation extraction model that leverages syntactic dependency and lexical similarity to enhance the attention mechanism and reduce the dependence on richly labeled training data. We conduct experiments on widely used real-world datasets, and the results demonstrate the effectiveness of the proposed model, even compared with the latest state-of-the-art Transformer-based models.
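The abstract describes enhancing attention with lexical similarity and syntactic dependency information. The following is a minimal sketch of that general idea (not the paper's actual model): attention scores over tokens are biased by a lexical-similarity term and restricted to tokens kept by a dependency-based mask. The function name, the `alpha` weight, and the toy inputs are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def similarity_biased_attention(H, q, sim, dep_mask, alpha=0.5):
    """Attention over token states H (n x d) for a query vector q (d,).

    sim      -- (n,) lexical-similarity scores between each word and the entities
    dep_mask -- (n,) boolean mask keeping tokens on dependency paths to the entities
    alpha    -- hypothetical weight for the similarity bias
    """
    scores = H @ q / np.sqrt(H.shape[1])        # scaled dot-product scores
    scores = scores + alpha * sim               # add lexical-similarity bias
    scores = np.where(dep_mask, scores, -1e9)   # mask tokens off the dependency paths
    weights = softmax(scores)
    return weights @ H, weights                 # context vector and attention weights
```

Masked tokens receive near-zero weight, so the context vector is a convex combination of the dependency-relevant, similarity-boosted token states only.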
Notes
- 1.
Note that, although many text conceptualization algorithms could be adopted here, we choose the state-of-the-art one [6], since the choice of algorithm is not central to this study.
References
Bilan, I., Roth, B.: Position-aware self-attention with relative positional encodings for slot filling. arXiv abs/1807.03052 (2018)
Bowen, Y., Zhang, Z., Liu, T., Wang, B., Li, S., Li, Q.: Beyond word attention: using segment attention in neural relation extraction. In: IJCAI (2019)
Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: NAACL-HLT (2019)
Guo, Z., Zhang, Y., Lu, W.: Attention guided graph convolutional networks for relation extraction. arXiv abs/1906.07510 (2019)
Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9, 1735–1780 (1997)
Huang, H., Wang, Y., Feng, C., Liu, Z., Zhou, Q.: Leveraging conceptualization for short-text embedding. IEEE Trans. Knowl. Data Eng. 30(7), 1282–1295 (2018)
Joshi, M., Chen, D., Liu, Y., Weld, D.S., Zettlemoyer, L., Levy, O.: SpanBERT: improving pre-training by representing and predicting spans. Trans. Assoc. Comput. Linguist. 8, 64–77 (2020)
Kim, Y.: Convolutional neural networks for sentence classification. In: EMNLP (2014)
Lan, Z., Chen, M., Goodman, S., Gimpel, K., Sharma, P., Soricut, R.: ALBERT: a lite BERT for self-supervised learning of language representations. arXiv abs/1909.11942 (2020)
Li, P., Mao, K., Yang, X., Li, Q.: Improving relation extraction with knowledge-attention. In: EMNLP/IJCNLP (2019)
Lin, Y., Shen, S., Liu, Z., Luan, H., Sun, M.: Neural relation extraction with selective attention over instances. In: ACL (2016)
Nayak, T.: Effective attention modeling for neural relation extraction. In: CoNLL 2019 (2019)
Nguyen, T.H., Grishman, R.: Relation extraction: perspective from convolutional neural networks. In: VS@HLT-NAACL (2015)
Rink, B., Harabagiu, S.M.: UTD: classifying semantic relations by combining lexical and semantic resources. In: SemEval@ACL (2010)
Song, Y., Wang, H., Wang, Z., Li, H., Chen, W.: Short text conceptualization using a probabilistic knowledgebase. In: Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence - Volume Three, pp. 2330–2336 (2011)
Song, Y., Wang, S., Wang, H.: Open domain short text conceptualization: a generative + descriptive modeling approach. In: Proceedings of the 24th International Conference on Artificial Intelligence (2015)
Wang, F., Wang, Z., Li, Z., Wen, J.R.: Concept-based short text classification and ranking. In: The ACM International Conference, pp. 1069–1078 (2014)
Wang, J., Wang, Z., Zhang, D., Yan, J.: Combining knowledge with deep convolutional neural networks for short text classification. In: Twenty-Sixth International Joint Conference on Artificial Intelligence, pp. 2915–2921 (2017)
Wang, Y., Huang, H., Feng, C.: Query expansion based on a feedback concept model for microblog retrieval. In: International Conference on World Wide Web, pp. 559–568 (2017)
Wang, Y., Huang, H., Feng, C., Zhou, Q., Gu, J., Gao, X.: CSE: conceptual sentence embeddings based on attention model. In: 54th Annual Meeting of the Association for Computational Linguistics, pp. 505–515 (2016)
Wang, Y., Liu, Y., Zhang, H., Xie, H.: Leveraging lexical semantic information for learning concept-based multiple embedding representations for knowledge graph completion. In: APWeb/WAIM (2019)
Wu, W., Li, H., Wang, H., Zhu, K.Q.: Probase: a probabilistic taxonomy for text understanding. In: ACM SIGMOD International Conference on Management of Data, pp. 481–492 (2012)
Zeng, D., Liu, K., Lai, S., Zhou, G., Zhao, J.: Relation classification via convolutional deep neural network. In: COLING (2014)
Zhang, Y., Qi, P., Manning, C.D.: Graph convolution over pruned dependency trees improves relation extraction. In: EMNLP (2018)
Zhang, Y., Zhong, V., Chen, D., Angeli, G., Manning, C.D.: Position-aware attention and supervised data improve slot filling. In: EMNLP (2017)
Zhao, Y., Wan, H., Gao, J., Lin, Y.: Improving relation classification by entity pair graph. In: ACML (2019)
Acknowledgements
We thank the anonymous reviewers for their valuable comments. This work is funded by: (i) the National Natural Science Foundation of China (No. U19B2026); (ii) the New Generation of Artificial Intelligence Special Action Project (No. AI20191125008); (iii) the National Integrated Big Data Center Pilot Project (No. 20500908, 17111001, 17111002).
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Wang, Y. (2021). Leveraging Syntactic Dependency and Lexical Similarity for Neural Relation Extraction. In: U, L.H., Spaniol, M., Sakurai, Y., Chen, J. (eds) Web and Big Data. APWeb-WAIM 2021. Lecture Notes in Computer Science(), vol 12858. Springer, Cham. https://doi.org/10.1007/978-3-030-85896-4_23
DOI: https://doi.org/10.1007/978-3-030-85896-4_23
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-85895-7
Online ISBN: 978-3-030-85896-4
eBook Packages: Computer Science (R0)