
Multi-Scale Convolutional Neural Network for Temporal Knowledge Graph Completion


Abstract

Knowledge graph completion is a critical task in natural language processing. The task becomes more challenging on temporal knowledge graphs, where each fact is associated with a timestamp. Cognitive science has revealed that time-dependent historical experience can activate neurons, and that time-related and static information should be fused to represent facts that have occurred. Meanwhile, the CNN model corresponds to the biological cortex in several respects; accordingly, information at different levels of the cortex can be described with convolution kernels of different sizes. Most existing methods for temporal knowledge graph completion learn time-varying relation embeddings that scale with the number of entities or timestamps, and then use the dot product between the embeddings of entities and relations as the score of a quadruple. However, the dot product cannot adequately describe the complex interactions between the embeddings. Inspired by these findings, this paper proposes a multi-scale convolutional neural network (MsCNN), which uses both static and dynamic information to represent relation embeddings, and applies convolution to learn the mutual information between the embeddings of time-varying relations and entities. In addition, multi-scale convolution kernels are used to capture this mutual information at different levels. We also verify that performance improves as the embedding dimension increases. MsCNN achieves state-of-the-art link prediction results on three benchmark datasets, showing that it fuses static and temporal information well and explores different levels of mutual information between the input embeddings.
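To make the scoring idea described above concrete, the sketch below shows one way a multi-scale convolutional scorer for temporal quadruples (subject, relation, object, timestamp) could be written in PyTorch. It is not the authors' released model: the class and parameter names (MsCNNScorer, rel_static, rel_temporal, kernel_sizes), the additive fusion of static and temporal relation information, and the choice of kernel sizes are all illustrative assumptions; only the overall ideas (fusing static and time-dependent relation information, then replacing the dot product with convolutions at several kernel sizes) come from the abstract.

```python
import torch
import torch.nn as nn

class MsCNNScorer(nn.Module):
    """Illustrative multi-scale convolutional scorer for temporal KG quadruples.

    Sketch only: fuses a static and a time-dependent relation embedding, stacks the
    result with the subject embedding, and extracts features with 1-D convolutions
    of several kernel sizes instead of scoring by a plain dot product.
    """

    def __init__(self, n_entities, n_relations, n_timestamps, dim=200,
                 kernel_sizes=(1, 3, 5), n_filters=32):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel_static = nn.Embedding(n_relations, dim)      # static relation part
        self.rel_temporal = nn.Embedding(n_timestamps, dim)   # time-dependent part (assumed form)
        # One convolution per kernel size: the "multi-scale" feature extractors.
        # Odd kernel sizes with padding=k//2 keep the sequence length equal to dim.
        self.convs = nn.ModuleList([
            nn.Conv1d(in_channels=2, out_channels=n_filters,
                      kernel_size=k, padding=k // 2)
            for k in kernel_sizes
        ])
        self.fc = nn.Linear(len(kernel_sizes) * n_filters * dim, dim)

    def forward(self, s, r, t):
        """Score every candidate object for a batch of (subject, relation, timestamp) indices."""
        subj = self.ent(s)                                     # (batch, dim)
        # Fuse static and temporal relation information (here: simple addition).
        rel = self.rel_static(r) + self.rel_temporal(t)        # (batch, dim)
        # Stack subject and fused relation embeddings as two input channels.
        x = torch.stack([subj, rel], dim=1)                    # (batch, 2, dim)
        feats = [torch.relu(conv(x)) for conv in self.convs]   # multi-scale features
        h = torch.cat([f.flatten(1) for f in feats], dim=1)    # (batch, sum of filters * dim)
        h = self.fc(h)                                         # (batch, dim)
        return h @ self.ent.weight.t()                         # (batch, n_entities) scores
```

Under this assumed setup, the returned scores over all candidate objects can be trained with a cross-entropy loss against the true object, a common formulation of the link prediction task; the multi-scale part is simply that each kernel size captures interactions between the subject and fused relation embedding at a different receptive width.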


Data Availability

Our manuscript has no associated data.


Author information

Corresponding author

Correspondence to Wei Liu.

Ethics declarations

Ethical Approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Conflict of Interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article


Cite this article

Liu, W., Wang, P., Zhang, Z. et al. Multi-Scale Convolutional Neural Network for Temporal Knowledge Graph Completion. Cogn Comput 15, 1016–1022 (2023). https://doi.org/10.1007/s12559-023-10134-7

