Journal of Computer Science and Technology, Volume 33, Issue 2, pp. 323-334

Modeling the Correlations of Relations for Knowledge Graph Embedding

  • Ji-Zhao Zhu
  • Yan-Tao Jia
  • Jun Xu
  • Jian-Zhong Qiao
  • Xue-Qi Cheng
Regular Paper

Abstract

Knowledge graph embedding, which maps entities and relations into low-dimensional vector spaces, has demonstrated its effectiveness in many tasks such as link prediction and relation extraction. Typical methods include TransE, TransH, and TransR. All these methods map different relations into the vector space separately, ignoring the intrinsic correlations among relations. Such correlations exist because different relations may connect to a common entity. For example, the triples (Steve Jobs, PlaceOfBirth, California) and (Apple Inc., Location, California) share the same tail entity, California. We analyze the embedded relation matrices learned by TransE/TransH/TransR and find that the correlations of relations do exist and manifest as a low-rank structure of the embedded relation matrix. It is natural to ask whether we can leverage these correlations to learn better embeddings for the entities and relations in a knowledge graph. In this paper, we propose to learn the embedded relation matrix by decomposing it into a product of two low-dimensional matrices, thereby characterizing its low-rank structure. The proposed method, called TransCoRe (Translation-Based Method via Modeling the Correlations of Relations), learns the embeddings of entities and relations within a translation-based framework. Experimental results on the benchmark datasets WordNet and Freebase demonstrate that our method outperforms the typical baselines on link prediction and triple classification tasks.
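
To make the idea concrete, the sketch below illustrates, under our own assumptions, how a TransE-style translation score can be combined with a relation embedding matrix factored into two low-dimensional matrices, as described in the abstract. It is a minimal illustration rather than the authors' implementation; all sizes, names, and the choice of an L1 score with a margin-based loss are hypothetical.

```python
# Minimal sketch (not the authors' code): relation embeddings are the product of
# two low-rank factors A (n_relations x k) and B (k x dim), and triples are scored
# with a TransE-style translation score ||h + r - t||_1.
import numpy as np

rng = np.random.default_rng(0)

n_entities, n_relations = 1000, 50   # hypothetical sizes
dim, k = 100, 20                     # embedding dimension and low rank (k < n_relations)

E = rng.normal(scale=0.1, size=(n_entities, dim))    # entity embeddings
A = rng.normal(scale=0.1, size=(n_relations, k))     # low-rank factor 1
B = rng.normal(scale=0.1, size=(k, dim))             # low-rank factor 2, shared by all relations

def relation_matrix():
    """Relation embedding matrix reconstructed from the two low-rank factors."""
    return A @ B                                      # shape: (n_relations, dim)

def score(h_idx, r_idx, t_idx):
    """Translation-based score; smaller means the triple is more plausible."""
    r = relation_matrix()[r_idx]
    return np.linalg.norm(E[h_idx] + r - E[t_idx], ord=1)

def margin_loss(pos, neg, margin=1.0):
    """Margin-based ranking loss over one (positive, corrupted) triple pair."""
    return max(0.0, margin + score(*pos) - score(*neg))

# Example: compare a triple against a corrupted version with a replaced tail entity.
print(margin_loss((0, 3, 7), (0, 3, 42)))
```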

Keywords

knowledge graph embedding; low-rank matrix decomposition


Supplementary material

ESM 1: 11390_2018_1821_MOESM1_ESM.pdf (PDF 613 kb)

References

  1. Miller G A. WordNet: A lexical database for English. Communications of the ACM, 1995, 38(11): 39-41.
  2. Bollacker K, Cook R, Tufts P. Freebase: A shared database of structured general human knowledge. In Proc. the 22nd National Conf. Artificial Intelligence, July 2007, pp.1962-1963.
  3. Bollacker K, Evans C, Paritosh P, Sturge T, Taylor J. Freebase: A collaboratively created graph database for structuring human knowledge. In Proc. ACM SIGMOD Int. Conf. Management of Data, June 2008, pp.1247-1250.
  4. Suchanek F M, Kasneci G, Weikum G. YAGO: A core of semantic knowledge unifying WordNet and Wikipedia. In Proc. the 16th Int. World Wide Web Conf., May 2007, pp.697-706.
  5. Tang J, Lou T C, Kleinberg J, Wu S. Transfer learning to infer social ties across heterogeneous networks. ACM Trans. Information Systems, 2016, 34(2): Article No. 7.
  6. Jia Y T, Wang Y Z, Lin H L, Jin X L, Cheng X Q. Locally adaptive translation for knowledge graph embedding. In Proc. the 30th AAAI Conf. Artificial Intelligence, February 2016, pp.992-998.
  7. Wu W T, Li H S, Wang H X, Zhu K Q. Probase: A probabilistic taxonomy for text understanding. In Proc. the ACM Int. Conf. Management of Data, May 2012, pp.481-492.
  8. Jayaram N, Khan A, Li C K, Yan X F, Elmasri R. Querying knowledge graphs by example entity tuples. IEEE Trans. Knowledge and Data Engineering, 2015, 27(10): 2797-2811.
  9. Bordes A, Usunier N, Garcia-Durán A, Weston J, Yakhnenko O. Translating embeddings for modeling multi-relational data. In Proc. the 26th Int. Conf. Neural Information Processing Systems, December 2013, pp.2787-2795.
  10. Wang Z, Zhang J W, Feng J L, Chen Z. Knowledge graph embedding by translating on hyperplanes. In Proc. the 28th AAAI Conf. Artificial Intelligence, July 2014, pp.1112-1119.
  11. Lin Y K, Liu Z Y, Sun M S, Liu Y, Zhu X. Learning entity and relation embeddings for knowledge graph completion. In Proc. the 29th AAAI Conf. Artificial Intelligence, January 2015.
  12. Alter O, Brown P O, Botstein D. Singular value decomposition for genome-wide expression data processing and modeling. Proceedings of the National Academy of Sciences of the United States of America, 2000, 97(18): 10101-10106.
  13. de Lathauwer L, de Moor B, Vandewalle J. A multilinear singular value decomposition. SIAM Journal on Matrix Analysis and Applications, 2000, 21(4): 1253-1278.
  14. Mikolov T, Sutskever I, Chen K, Corrado G, Dean J. Distributed representations of words and phrases and their compositionality. In Proc. the 26th Int. Conf. Neural Information Processing Systems, December 2013, pp.3111-3119.
  15. Nickel M, Tresp V, Kriegel H P. Factorizing YAGO: Scalable machine learning for linked data. In Proc. the 21st Int. Conf. World Wide Web, April 2012, pp.271-280.
  16. Franz T, Schultz A, Sizov S, Staab S. TripleRank: Ranking semantic Web data by tensor decomposition. In Proc. the 8th Int. Semantic Web Conf., October 2009, pp.213-228.
  17. Chang K W, Yih W T, Yang B S, Meek C. Typed tensor decomposition of knowledge bases for relation extraction. In Proc. Conf. Empirical Methods in Natural Language Processing, October 2014, pp.1568-1579.
  18. Chang K W, Yih W T, Meek C. Multi-relational latent semantic analysis. In Proc. Conf. Empirical Methods in Natural Language Processing, October 2013, pp.1602-1612.
  19. Kiers H A L. Towards a standardized notation and terminology in multiway analysis. Journal of Chemometrics, 2000, 14(3): 105-122.
  20. Bordes A, Glorot X, Weston J, Bengio Y. Joint learning of words and meaning representations for open-text semantic parsing. In Proc. the 15th Int. Conf. Artificial Intelligence and Statistics, April 2012, pp.127-135.
  21. Bordes A, Glorot X, Weston J, Bengio Y. A semantic matching energy function for learning with multi-relational data. Machine Learning, 2014, 94(2): 233-259.
  22. Bordes A, Weston J, Collobert R, Bengio Y. Learning structured embeddings of knowledge bases. In Proc. the 25th AAAI Conf. Artificial Intelligence, August 2011, pp.301-306.
  23. Jenatton R, Le Roux N, Bordes A, Obozinski G. A latent factor model for highly multi-relational data. In Proc. the 25th Int. Conf. Neural Information Processing Systems, December 2012, pp.3167-3175.
  24. Socher R, Chen D Q, Manning C D, Ng A. Reasoning with neural tensor networks for knowledge base completion. In Proc. the 26th Int. Conf. Neural Information Processing Systems, December 2013, pp.926-934.
  25. Pearson K. Note on regression and inheritance in the case of two parents. Proceedings of the Royal Society of London, 1895, 58(347/348/349/350/351/352): 240-242.
  26. Ji G L, He S Z, Xu L H, Liu K, Zhao J. Knowledge graph embedding via dynamic mapping matrix. In Proc. the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th Int. Joint Conf. Natural Language Processing, July 2015, pp.687-696.
  27. Lin Y K, Liu Z Y, Luan H B, Sun M S, Rao S W, Liu S. Modeling relation paths for representation learning of knowledge bases. In Proc. Conf. Empirical Methods in Natural Language Processing, September 2015, pp.705-714.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. College of Computer Science and Engineering, Northeastern University, Shenyang, China
  2. Key Laboratory of Network Data Science and Technology, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China
