Knowledge Representation Learning Based on GAN and Pre-training

  • Conference paper
Proceedings of 2021 Chinese Intelligent Systems Conference

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 805)

Abstract

Knowledge representation learning projects the entities and relations of knowledge-graph triples into a vector space, yielding representations of their semantic information and enabling efficient computation over entities, relations, and their complex semantic associations. However, most existing models use only the structural information of triples, ignoring semantically rich entity descriptions. In addition, knowledge representation learning methods usually generate negative samples by random sampling; such negatives are easily recognized by the model, and as training proceeds they contribute less and less to improving model performance. We therefore propose KRGP, a Knowledge Representation learning method based on a Generative adversarial network and a Pre-trained language model. Specifically, KRGP combines the ideas of generative adversarial networks, semantic matching models, and translation-based distance models to generate high-quality negative samples. We concatenate the descriptions of the head entity, relation, and tail entity of each negative-sample triple into a text sequence, and then use BERT's next sentence prediction task to cast scoring of the sequence as a binary classification problem. Experimental results show that this method improves link prediction and triple classification performance on multiple datasets.
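The negative-sampling and sequence-construction steps described in the abstract can be sketched as follows. This is a minimal illustration with hypothetical entities and descriptions: it uses uniform random corruption in place of the paper's GAN-based generator (which scores candidates with semantic matching and translation-distance models), and the resulting text sequence would then be fed to BERT's next sentence prediction head as a binary classifier.

```python
import random

# Toy knowledge-graph triples (hypothetical example data).
triples = [
    ("paris", "capital_of", "france"),
    ("berlin", "capital_of", "germany"),
]
entities = ["paris", "berlin", "france", "germany"]

# Hypothetical entity descriptions, standing in for the textual
# descriptions the method attaches to each entity.
descriptions = {
    "paris": "Paris is the capital city of France.",
    "berlin": "Berlin is the capital city of Germany.",
    "france": "France is a country in Western Europe.",
    "germany": "Germany is a country in Central Europe.",
}

def corrupt(triple, entities, rng=random):
    """Baseline negative sampling: replace the head or tail with a
    random entity, avoiding the original triple. KRGP would instead
    let a GAN generator pick hard negatives from such candidates."""
    h, r, t = triple
    while True:
        if rng.random() < 0.5:
            cand = (rng.choice(entities), r, t)  # corrupt head
        else:
            cand = (h, rng.choice(entities))[0], r, rng.choice(entities)
        cand = cand if isinstance(cand, tuple) else cand
        if cand != triple:
            return cand

def triple_to_sequence(triple, descriptions):
    """Serialise a triple as a BERT-style input sequence:
    [CLS] head description [SEP] relation [SEP] tail description [SEP]."""
    h, r, t = triple
    relation_text = r.replace("_", " ")
    return (f"[CLS] {descriptions[h]} [SEP] {relation_text} "
            f"[SEP] {descriptions[t]} [SEP]")
```

Classifying each serialised sequence as plausible or implausible is what turns triple scoring into the binary classification problem mentioned above.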



Acknowledgements

This work was supported by National Key R&D Program of China (2018YFB1402600), by the National Natural Science Foundation of China (61772083), and by Science and Technology Major Project of Guangxi (GuikeAA18118054).

Author information

Corresponding author

Correspondence to Junping Du.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper
Cite this paper

Ouyang, S., Du, J., Shao, Y., Li, A., Xu, X. (2022). Knowledge Representation Learning Based on GAN and Pre-training. In: Jia, Y., Zhang, W., Fu, Y., Yu, Z., Zheng, S. (eds) Proceedings of 2021 Chinese Intelligent Systems Conference. Lecture Notes in Electrical Engineering, vol 805. Springer, Singapore. https://doi.org/10.1007/978-981-16-6320-8_46
