Abstract
Knowledge representation learning embeds the entities and relations of a knowledge graph's triples into a vector space, capturing their semantic information and enabling effective computation over entities, relations, and their complex semantic associations. However, most existing models use only the structural information of triples, ignoring semantically rich entity descriptions. In addition, knowledge representation learning methods usually generate negative samples by random sampling; such negatives are easily recognized by the model, and as training iterates they contribute less and less to improving model performance. We therefore propose KRGP, a Knowledge Representation learning method based on a Generative adversarial network and a Pre-trained language model. Specifically, it combines the ideas of generative adversarial networks, semantic matching models, and translation distance models to generate high-quality negative samples. We concatenate the descriptions of the head entity, relation, and tail entity of each negative-sample triple into a text sequence, and then use BERT's next sentence prediction task to cast scoring that sequence as a binary classification problem. Experimental results show that this method improves link prediction and triple classification performance on multiple datasets.
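The two steps described above can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the helper names are hypothetical, the generator is stood in for by a DistMult-style semantic matching score, and the sentence-pair string would in practice be built by BERT's tokenizer rather than by hand.

```python
import math
import random

def distmult_score(h, r, t):
    # Semantic-matching (DistMult-style) score: <h, r, t> = sum_i h_i * r_i * t_i.
    # A higher score means the triple looks more plausible to the generator.
    return sum(hi * ri * ti for hi, ri, ti in zip(h, r, t))

def sample_hard_negative(pos_triple, candidate_tails, ent_emb, rel_emb):
    # Generator step (hypothetical sketch): score each corrupted triple and
    # sample a replacement tail with probability softmax(score), so that
    # high-scoring, hard-to-distinguish negatives are chosen more often than
    # uniform random corruption would choose them.
    h, r, _ = pos_triple
    scores = [distmult_score(ent_emb[h], rel_emb[r], ent_emb[c])
              for c in candidate_tails]
    m = max(scores)  # subtract max for numerical stability
    weights = [math.exp(s - m) for s in scores]
    return random.choices(candidate_tails, weights=weights, k=1)[0]

def triple_to_text(head_desc, rel_name, tail_desc):
    # Discriminator input: concatenate the descriptions into a BERT-style
    # sequence; the NSP head then classifies it as a valid/invalid triple.
    return f"[CLS] {head_desc} [SEP] {rel_name} {tail_desc} [SEP]"
```

For example, `triple_to_text("Paris is a city", "capital_of", "France is a country")` yields one sequence the binary classifier would score.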
References
Bordes, A., et al.: Translating embeddings for modeling multi-relational data. In: Neural Information Processing Systems (NIPS) (2013)
Wang, Z., et al.: Knowledge graph embedding by translating on hyperplanes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 28, no. 1 (2014)
Yang, B., et al.: Embedding entities and relations for learning and inference in knowledge bases. arXiv preprint arXiv:1412.6575 (2014)
Xie, R., et al.: Representation learning of knowledge graphs with entity descriptions. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 30, no. 1 (2016)
Cai, L., Wang, W.Y.: KBGAN: adversarial learning for knowledge graph embeddings. arXiv preprint arXiv:1711.04071 (2017)
Mikolov, T., et al.: Distributed representations of words and phrases and their compositionality. arXiv preprint arXiv:1310.4546 (2013)
Devlin, J., et al.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
Zhang, Z., et al.: ERNIE: enhanced language representation with informative entities. arXiv preprint arXiv:1905.07129 (2019)
Yao, L., Mao, C., Luo, Y.: KG-BERT: BERT for knowledge graph completion. arXiv preprint arXiv:1909.03193 (2019)
Han, X., et al.: OpenKE: an open toolkit for knowledge embedding. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing: System Demonstrations (2018)
Acknowledgements
This work was supported by National Key R&D Program of China (2018YFB1402600), by the National Natural Science Foundation of China (61772083), and by Science and Technology Major Project of Guangxi (GuikeAA18118054).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Ouyang, S., Du, J., Shao, Y., Li, A., Xu, X. (2022). Knowledge Representation Learning Based on GAN and Pre-training. In: Jia, Y., Zhang, W., Fu, Y., Yu, Z., Zheng, S. (eds) Proceedings of 2021 Chinese Intelligent Systems Conference. Lecture Notes in Electrical Engineering, vol 805. Springer, Singapore. https://doi.org/10.1007/978-981-16-6320-8_46
Publisher Name: Springer, Singapore
Print ISBN: 978-981-16-6319-2
Online ISBN: 978-981-16-6320-8
eBook Packages: Intelligent Technologies and Robotics (R0)