
Complex query answering over knowledge graphs foundation model using region embeddings on a lie group


Abstract

Answering complex queries with first-order logical operators over knowledge graphs, such as conjunction (\(\wedge \)), disjunction (\(\vee \)), and negation (\(\lnot \)), is immensely useful for identifying missing knowledge. Recently, neural symbolic reasoning methods have been proposed that map entities and relations into a continuous real vector space and model logical operators as differentiable neural networks. However, traditional methods employ negative sampling, which corrupts complex queries to train embeddings. Consequently, these embeddings are susceptible to divergence in the open manifold of \(\mathbb {R}^n\), and appropriate regularization is crucial for addressing this divergence. In this paper, we introduce a Lie group as a compact embedding space for complex query embedding, enhancing the ability of the knowledge graph foundation model to handle the intricacies of knowledge graphs. Our method aims to answer disjunctive and conjunctive queries. Entities and queries are represented as regions of a high-dimensional torus, where projection, intersection, union, and negation on the torus naturally simulate the corresponding query operations. After applying these operations to the torus regions we define, the resulting geometry remains unchanged. Extensive experiments on FB15K, FB15K-237, and NELL995 show that our approach achieves significant improvements, leveraging the strengths of the knowledge graph foundation model and complex query processing.
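
To make the idea above concrete, the sketch below illustrates one possible way to realize region embeddings on a torus: each embedding dimension is an arc on the unit circle (angles in [0, 1)) described by a center and a half-width, relation projection is a translation on the torus, and conjunction, negation, and disjunction act dimension-wise on those arcs. This is an illustrative assumption for exposition only, not the authors' implementation; the class TorusRegion and helpers such as circular_difference are hypothetical names introduced here.

import numpy as np

# Illustrative sketch only (not the authors' released code): one way to realize
# region embeddings on a torus. Each dimension of the n-dimensional torus is the
# unit circle [0, 1); a region is an arc per dimension given by a center angle
# and a half-width, and logical operators act dimension-wise on those arcs.

def circular_difference(a, b):
    """Signed difference a - b wrapped to [-0.5, 0.5) on the unit circle."""
    return np.mod(a - b + 0.5, 1.0) - 0.5

def circular_distance(a, b):
    """Shortest angular distance between a and b on the unit circle."""
    return np.abs(circular_difference(a, b))

class TorusRegion:
    """A region on the n-dimensional torus: one arc (center, half_width) per dimension."""

    def __init__(self, center, half_width):
        self.center = np.mod(np.asarray(center, dtype=float), 1.0)
        self.half_width = np.clip(np.asarray(half_width, dtype=float), 0.0, 0.5)

    def project(self, relation_shift, relation_spread=0.0):
        # Relation projection as a translation on the torus (in the spirit of
        # TorusE-style translations), optionally widening the arc.
        return TorusRegion(self.center + relation_shift,
                           self.half_width + relation_spread)

    def negate(self):
        # The complement of an arc of length 2w is the arc of length 1 - 2w
        # centered at the antipodal point, so negation stays on the torus.
        return TorusRegion(self.center + 0.5, 0.5 - self.half_width)

    def intersect(self, other):
        # Conjunction: per-dimension arc intersection, computed in a frame where
        # this arc is centered at 0 (assumes arcs shorter than half the circle).
        offset = circular_difference(other.center, self.center)
        lo = np.maximum(-self.half_width, offset - other.half_width)
        hi = np.minimum(self.half_width, offset + other.half_width)
        half_width = np.clip((hi - lo) / 2.0, 0.0, 0.5)   # empty overlap -> width 0
        return TorusRegion(self.center + (hi + lo) / 2.0, half_width)

    def contains(self, point):
        # An entity (a point on the torus) answers the query if it lies inside
        # the arc in every dimension.
        return bool(np.all(circular_distance(point, self.center) <= self.half_width))

def union(regions):
    # Disjunction: a union of arcs is generally not a single arc, so disjunctive
    # queries can be kept in disjunctive normal form as a list of regions.
    return list(regions)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 4
    q1 = TorusRegion(rng.random(dim), 0.10 * np.ones(dim))
    q2 = TorusRegion(q1.center + 0.05, 0.10 * np.ones(dim))
    conj = q1.intersect(q2)           # overlapping arcs
    neg = q1.negate()                 # complementary arcs
    print(conj.contains(q1.center))   # True: q1's center lies in the overlap
    print(neg.contains(q1.center))    # False: q1's center is excluded by negation

Because a translated, complemented, or intersected arc is again an arc (and a union is a list of arcs) on the same compact torus, such embeddings cannot drift off to infinity the way unconstrained embeddings can in the open manifold \(\mathbb {R}^n\), which is the property the abstract highlights.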

Availability of data and materials

The datasets generated and analysed during the current study are available in the FB15K, FB15K-237, and NELL995 benchmarks.

Acknowledgements

This work is partially supported by the National Key R&D Program of China (2023YFC2705700), the National Natural Science Foundation of China (62206202, 62225113), the China Postdoctoral Science Foundation (2022M712461), and the Artificial Intelligence Innovation Project of Wuhan Science and Technology Bureau (No. 2022010702040070).

Author information

Authors and Affiliations

Authors

Corresponding authors

Correspondence to Guojia Wan or Bo Du.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zhou, Z., Wan, G., Pan, S. et al. Complex query answering over knowledge graphs foundation model using region embeddings on a lie group. World Wide Web 27, 23 (2024). https://doi.org/10.1007/s11280-024-01254-7
