Existing knowledge graph embedding methods do not ensure that the new triples they rank highly are consistent with the logical background, i.e., the knowledge graph together with a logical theory. As a result, users must expend considerable effort filtering out inconsistent triples before adding new ones to the knowledge graph. To alleviate this burden, we propose an approach that enhances existing embedding-based methods by encoding logical consistency into the learnt distributed representation of the knowledge graph, so that highly ranked new triples are as consistent as possible. To evaluate this approach, we construct four knowledge graphs with logical theories from the four great classical masterpieces of Chinese literature. Experimental results on these datasets show that our approach keeps highly ranked triples as consistent as possible while achieving performance comparable to baseline methods in link prediction and triple classification.



This work was partly supported by the National Natural Science Foundation of China (61375056 and 61876204), the Science and Technology Program of Guangzhou (201804010496), and the Scientific Research Innovation Team Project of the Department of Education of Guangdong Province (2017KCXTD013).



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Guangdong University of Foreign Studies, Guangzhou, China
