Abstract
Document-level relation extraction aims to identify semantic relations between target entities in a document. Most existing work roughly treats the document as a long sequence and produces target-agnostic representations for relation prediction, limiting the model's ability to focus on the context relevant to the target entities. In this paper, we reformulate the document-level relation extraction task and propose an NA-aware machine Reading Comprehension (NARC) model to tackle this problem. Specifically, the input sequence, formulated as the concatenation of a head entity and a document, is fed into the encoder to obtain comprehensive target-aware representations for each entity. In this way, the relation extraction task is converted into a reading comprehension problem by taking all the tail entities as candidate answers. We then add an artificial answer NO-ANSWER (NA) for each query and dynamically generate an NA score based on the decomposition and composition of all candidate tail entity features; this score finally weighs the prediction results to alleviate the negative effect of the many no-answer instances introduced by the task reformulation. Experimental results on DocRED, together with extensive analysis, demonstrate the effectiveness of NARC.
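To make the reformulation concrete, below is a minimal sketch of how the query-document input described above could be assembled. This is our illustration, not the authors' released code: it assumes a BERT-style tokenizer from the HuggingFace transformers library, and the function name build_narc_input is hypothetical.

```python
# Sketch of the NARC-style input formulation: the head entity serves as the
# query and is concatenated with the document, i.e.
#   [CLS] head_entity [SEP] document [SEP]
# Assumption (ours): a BERT-style tokenizer from HuggingFace `transformers`.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def build_narc_input(head_entity: str, document: str, max_length: int = 512):
    """Encode one (head entity, document) query for the reading-comprehension
    formulation; every tail entity in the document is a candidate answer."""
    return tokenizer(
        head_entity,               # query segment (segment A)
        document,                  # document segment (segment B)
        truncation="only_second",  # truncate the document, never the query
        max_length=max_length,
        return_tensors="pt",
    )

encoding = build_narc_input(
    "Barack Obama",
    "Barack Obama was born in Honolulu, Hawaii. He served as ...",
)
print(encoding["input_ids"].shape)  # (1, sequence_length)

# An artificial NO-ANSWER (NA) candidate would then be appended to the tail
# entity list, and its dynamically generated score used to weigh predictions.
```

Candidate-answer scoring and the dynamic NA score are separate model components; this sketch covers only the input formulation.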
Notes
- 1.
For more details about the construction process of the Entity Graph, we refer readers to the original paper [5].
References
Cao, Q., Trivedi, H., Balasubramanian, A., Balasubramanian, N.: DeFormer: decomposing pre-trained transformers for faster question answering. In: Proceedings of ACL, pp. 4487–4497 (2020)
Cao, Y., Fang, M., Tao, D.: BAG: bi-directional attention entity graph convolutional network for multi-hop reasoning question answering. In: Proceedings of NAACL, pp. 357–362 (2019)
Christopoulou, F., Miwa, M., Ananiadou, S.: A walk-based model on entity graphs for relation extraction. In: Proceedings of ACL, pp. 81–88 (2018)
Christopoulou, F., Miwa, M., Ananiadou, S.: Connecting the dots: document-level neural relation extraction with edge-oriented graphs. In: Proceedings of EMNLP, pp. 4927–4938 (2019)
De Cao, N., Aziz, W., Titov, I.: Question answering by reasoning across documents with graph convolutional networks. In: Proceedings of NAACL, pp. 2306–2317 (2019)
Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of NAACL, pp. 4171–4186 (2019)
Feng, R., Yuan, J., Zhang, C.: Probing and fine-tuning reading comprehension models for few-shot event extraction. arXiv preprint arXiv:2010.11325 (2020)
Guo, Z., Zhang, Y., Lu, W.: Attention guided graph convolutional networks for relation extraction. In: Proceedings of ACL, pp. 241–251 (2019)
Gupta, P., Rajaram, S., Schütze, H., Runkler, T.: Neural relation extraction within and across sentence boundaries. In: Proceedings of AAAI, pp. 6513–6520 (2019)
He, Z., Chen, W., Li, Z., Zhang, M., Zhang, W., Zhang, M.: SEE: syntax-aware entity embedding for neural relation extraction. In: Proceedings of AAAI, pp. 5795–5802 (2018)
Hu, M., Wei, F., Peng, Y., Huang, Z., Yang, N., Li, D.: Read + Verify: machine reading comprehension with unanswerable questions. In: Proceedings of AAAI, pp. 6529–6537 (2019)
Jia, R., Wong, C., Poon, H.: Document-level n-ary relation extraction with multiscale representation learning. In: Proceedings of NAACL, pp. 3693–3704 (2019)
Li, X., Feng, J., Meng, Y., Han, Q., Wu, F., Li, J.: A unified MRC framework for named entity recognition. In: Proceedings of ACL, pp. 5849–5859 (2019)
Li, X., et al.: Entity-relation extraction as multi-turn question answering. In: Proceedings of ACL, pp. 1340–1350 (2019)
Li, Y., et al.: Self-attention enhanced selective gate with entity-aware embedding for distantly supervised relation extraction. In: Proceedings of AAAI, pp. 8269–8276 (2020)
Liu, Y., et al.: RoBERTa: a robustly optimized BERT pretraining approach. arXiv preprint arXiv:1907.11692 (2019)
Nan, G., Guo, Z., Sekulić, I., Lu, W.: Reasoning with latent structure refinement for document-level relation extraction. In: Proceedings of ACL, pp. 1546–1557 (2020)
Peng, N., Poon, H., Quirk, C., Toutanova, K., Yih, W.T.: Cross-sentence n-ary relation extraction with graph LSTMs. TACL 5, 101–115 (2017)
Rajpurkar, P., Jia, R., Liang, P.: Know what you don't know: unanswerable questions for SQuAD. In: Proceedings of ACL, pp. 784–789 (2018)
Schlichtkrull, M., Kipf, T.N., Bloem, P., van den Berg, R., Titov, I., Welling, M.: Modeling relational data with graph convolutional networks. In: Proceedings of ESWC, pp. 593–607 (2018)
Seo, M., Kembhavi, A., Farhadi, A., Hajishirzi, H.: Bidirectional attention flow for machine comprehension. arXiv preprint arXiv:1611.01603 (2016)
Song, L., Zhang, Y., Wang, Z., Gildea, D.: N-ary relation extraction using graph-state LSTM. In: Proceedings of EMNLP, pp. 2226–2235 (2018)
Sun, F., Li, L., Qiu, X., Liu, Y.: U-Net: machine reading comprehension with unanswerable questions. arXiv preprint arXiv:1810.06638 (2018)
Tang, H., et al.: HIN: hierarchical inference network for document-level relation extraction. In: Proceedings of PAKDD, pp. 197–209 (2020)
Wang, D., Hu, W., Cao, E., Sun, W.: Global-to-local neural networks for document-level relation extraction. In: Proceedings of EMNLP, pp. 3711–3721 (2020)
Wang, H., Focke, C., Sylvester, R., Mishra, N., Wang, W.: Fine-tune BERT for DocRED with two-step process. arXiv preprint arXiv:1909.11898 (2019)
Wang, L., Cao, Z., De Melo, G., Liu, Z.: Relation classification via multi-level attention CNNs. In: Proceedings of ACL, pp. 1298–1307 (2016)
Wang, Z., Mi, H., Ittycheriah, A.: Sentence similarity learning by lexical decomposition and composition. In: Proceedings of COLING, pp. 1340–1349 (2016)
Welbl, J., Stenetorp, P., Riedel, S.: Constructing datasets for multi-hop reading comprehension across documents. TACL 6, 287–302 (2018)
Wu, W., Wang, F., Yuan, A., Wu, F., Li, J.: Coreference resolution as query-based span prediction. In: Proceedings of ACL, pp. 6953–6963 (2020)
Yao, Y., et al.: DocRED: a large-scale document-level relation extraction dataset. In: Proceedings of ACL (2019)
Ye, D., Lin, Y., Du, J., Liu, Z., Sun, M., Liu, Z.: Coreferential reasoning learning for language representation. In: Proceedings of EMNLP (2020)
Zeng, D., Liu, K., Chen, Y., Zhao, J.: Distant supervision for relation extraction via piecewise convolutional neural networks. In: Proceedings of EMNLP, pp. 1753–1762 (2015)
Zeng, D., Liu, K., Lai, S., Zhou, G., Zhao, J.: Relation classification via convolutional deep neural network. In: Proceedings of COLING, pp. 2335–2344 (2014)
Zhang, Y., Zhong, V., Chen, D., Angeli, G., Manning, C.D.: Position-aware attention and supervised data improve slot filling. In: Proceedings of EMNLP, pp. 35–45 (2017)
Zhang, Z., Shu, X., Yu, B., Liu, T., Zhao, J., Li, Q., Guo, L.: Distilling knowledge from well-informed soft labels for neural relation extraction. In: Proceedings of AAAI, pp. 9620–9627 (2020)
Zhang, Z., Yu, B., Shu, X., Liu, T., Tang, H., Wang, Y., Guo, L.: Document-level relation extraction with dual-tier heterogeneous graph. In: Proceedings of COLING, pp. 1630–1641 (2020)
Acknowledgments
We would like to thank all reviewers for their insightful comments and suggestions. The work is supported by the Strategic Priority Research Program of Chinese Academy of Sciences (grant No. XDC02040400), and the Youth Innovation Promotion Association of Chinese Academy of Sciences (grant No. 2021153).
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
Zhang, Z., Yu, B., Shu, X., Liu, T. (2021). NA-Aware Machine Reading Comprehension for Document-Level Relation Extraction. In: Oliver, N., Pérez-Cruz, F., Kramer, S., Read, J., Lozano, J.A. (eds) Machine Learning and Knowledge Discovery in Databases. Research Track. ECML PKDD 2021. Lecture Notes in Computer Science, vol. 12977. Springer, Cham. https://doi.org/10.1007/978-3-030-86523-8_35
DOI: https://doi.org/10.1007/978-3-030-86523-8_35
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-86522-1
Online ISBN: 978-3-030-86523-8
eBook Packages: Computer Science, Computer Science (R0)