Abstract
Few-shot relation extraction involves predicting the relations between entity pairs in a sentence given only a limited number of labeled instances for each relation. Prototypical networks, based on the meta-learning framework, have been widely adopted for this task. Existing prototypical network-based approaches typically obtain the relation representation by concatenating the embeddings corresponding to the start tokens of the two entity mentions. While these methods have demonstrated commendable performance, we argue that this relation representation fails to fully capture semantic nuances in complex scenes, where identical entity pairs can convey diverse semantic relationships. In this paper, we propose a novel relation representation that integrates textual context and entity mentions through a prompt template. Furthermore, we introduce a gate mechanism to selectively incorporate external relation knowledge into the original relation prototype derived from support instances. Experimental results on two benchmark datasets demonstrate the effectiveness of our proposed approach.
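The pipeline the abstract describes — averaging support-instance embeddings into a relation prototype, gating in external relation knowledge, and classifying a query by its nearest prototype — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the linear gate parameters `W` and `b`, the Euclidean metric, and the convex gate combination are assumptions for exposition.

```python
import numpy as np

def prototype(support_emb):
    # support_emb: (K, d) embeddings of the K support instances for one relation;
    # the prototype is their mean, as in prototypical networks
    return support_emb.mean(axis=0)

def gated_fusion(proto, knowledge, W, b):
    # A scalar gate g in (0, 1) decides how much external relation
    # knowledge to mix into the prototype (illustrative parameterization).
    z = np.concatenate([proto, knowledge]) @ W + b
    g = 1.0 / (1.0 + np.exp(-z))  # sigmoid
    return g * proto + (1.0 - g) * knowledge

def classify(query_emb, prototypes):
    # Assign the query to the nearest prototype by Euclidean distance.
    d = np.linalg.norm(prototypes - query_emb, axis=1)
    return int(np.argmin(d))

# Toy 2-way 2-shot episode in a 2-dimensional embedding space.
p0 = prototype(np.array([[1.0, 0.0], [1.0, 0.0]]))
p1 = prototype(np.array([[0.0, 1.0], [0.0, 1.0]]))
print(classify(np.array([0.9, 0.1]), np.stack([p0, p1])))  # → 0
```

In practice the embeddings would come from a BERT-style encoder over the prompt template, and the gate would be learned end-to-end with the episode loss.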
Acknowledgments
This work is supported by the National Key R&D Program of China (2021ZD0113902).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Li, L., Zhang, Y., Zou, J., Huang, Y. (2024). PNPT: Prototypical Network with Prompt Template for Few-Shot Relation Extraction. In: Wu, F., et al. Social Media Processing. SMP 2023. Communications in Computer and Information Science, vol 1945. Springer, Singapore. https://doi.org/10.1007/978-981-99-7596-9_12
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-7595-2
Online ISBN: 978-981-99-7596-9
eBook Packages: Computer Science, Computer Science (R0)