
Representation Learning with Entity Topics for Knowledge Graphs

  • Xin Ouyang
  • Yan Yang
  • Liang He
  • Qin Chen
  • Jiacheng Zhang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10412)

Abstract

Knowledge representation learning, which represents triples as semantic embeddings, has achieved tremendous success in recent years. Recent work aims at integrating the information of triples with texts, which has shown great advantages in alleviating the data sparsity problem. However, most of these methods rely on word-level information such as co-occurrence in texts, while ignoring the latent semantics of entities. In this paper, we propose an entity topic based representation learning (ETRL) method, which enhances the triple representations with the entity topics learned by a topic model. We evaluate our proposed method on the knowledge graph completion task. The experimental results show that our method outperforms most state-of-the-art methods. Specifically, we achieve a maximum improvement of 7.9% in terms of hits@10.
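The abstract describes the general idea but not the model details. As an illustrative sketch only (the combination rule, projection matrix, and all names here are assumptions, not the authors' actual ETRL formulation), a translation-based triple score enhanced with entity topic distributions might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes: 4 entities, 2 relations, embedding dim 8, 3 topics.
n_ent, n_rel, dim, n_topics = 4, 2, 8, 3

# Structure-based embeddings, as in translation models (h + r ≈ t).
ent_emb = rng.normal(size=(n_ent, dim))
rel_emb = rng.normal(size=(n_rel, dim))

# Hypothetical entity-topic distributions (rows sum to 1), e.g. learned by
# a topic model such as LDA or NMF over entity description texts.
topics = rng.random(size=(n_ent, n_topics))
topics /= topics.sum(axis=1, keepdims=True)

# Hypothetical projection mapping topic space into the embedding space.
topic_proj = rng.normal(size=(n_topics, dim))

def entity_repr(e, alpha=0.5):
    """Combine the structural embedding with a topic-based embedding."""
    return ent_emb[e] + alpha * (topics[e] @ topic_proj)

def score(h, r, t):
    """TransE-style energy on topic-enhanced representations (lower = more plausible)."""
    return np.linalg.norm(entity_repr(h) + rel_emb[r] - entity_repr(t), ord=1)

def rank_tails(h, r):
    """Knowledge graph completion: rank all candidate tails for a query (h, r, ?)."""
    scores = [score(h, r, t) for t in range(n_ent)]
    return np.argsort(scores)  # most plausible candidates first

print(rank_tails(0, 1))
```

In this sketch, hits@10 would be computed by checking whether the true tail entity appears among the first ten entries of `rank_tails(h, r)` for each test triple.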

Keywords

Knowledge representation · Entity topics · Topic model · Knowledge graph completion

Notes

Acknowledgments

This work was supported by the National Key Technology Support Program (No. 2015BAH01F02), the Shanghai Municipal Commission of Economy and Information under Grant Project No. 201602024, the Natural Science Foundation of Shanghai (No. 17ZR1444900), and the Science and Technology Commission of Shanghai Municipality (No. 15PJ1401700).


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Institute of Computer Applications, East China Normal University, Shanghai, China
  2. Shanghai Engineering Research Center of Intelligent Service Robot, Shanghai, China
