Abstract
Knowledge graphs contain a large number of factual triples, yet they remain inevitably incomplete. Previous commonsense-based knowledge graph completion models replace the entities in triples with their concepts to generate high-quality negative samples and to perform link prediction from the joint perspective of facts and commonsense. However, these models ignore the relative importance of concepts and their correlation with relations, which introduces considerable noise and leaves ample room for improvement. To address this problem, we construct commonsense knowledge tailored to knowledge graph completion and filter it using the analytic hierarchy process. The resulting commonsense further improves the quality of negative samples and the effectiveness of link prediction. Experimental results on four knowledge graph completion (KGC) datasets show that our method improves the performance of the original knowledge graph embedding (KGE) models.
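The abstract's filtering step relies on the analytic hierarchy process (AHP), which derives importance weights for a set of criteria from a pairwise-comparison matrix. Below is a minimal, self-contained sketch of that weighting step; the three criteria named in the comments are illustrative assumptions, not the paper's actual criteria, and the comparison values are hypothetical judgments on Saaty's 1-9 scale.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix over three criteria one might
# use to score a commonsense triple (e.g. concept frequency,
# concept-relation correlation, concept specificity). Entry A[i, j]
# states how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# The principal right eigenvector of A, normalized to sum to 1,
# gives the AHP priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), and the
# consistency ratio CR = CI / RI, where RI(3) = 0.58 is Saaty's
# random index. CR < 0.1 means the judgments are acceptably consistent.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
```

Commonsense candidates would then be ranked by the weighted sum of their per-criterion scores, and low-scoring entries filtered out before negative sampling.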
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Liu, C., Tang, J., Zeng, W., Wu, J., Huang, H. (2023). Knowledge Graph Completion with Fused Factual and Commonsense Information. In: Yuan, L., Yang, S., Li, R., Kanoulas, E., Zhao, X. (eds) Web Information Systems and Applications. WISA 2023. Lecture Notes in Computer Science, vol 14094. Springer, Singapore. https://doi.org/10.1007/978-981-99-6222-8_12
Print ISBN: 978-981-99-6221-1
Online ISBN: 978-981-99-6222-8