
Link Prediction Based on the Sub-graphs Learning with Fused Features

  • Conference paper
  • First Online:
Neural Information Processing (ICONIP 2023)

Abstract

As one of the important research methods in the area of knowledge graph completion, link prediction aims to capture the structural or attribute information of nodes in a network in order to predict the probability of a link between nodes. In particular, graph neural networks based on sub-graphs provide a popular representation learning approach for link prediction tasks. However, they incur high resource consumption on large graphs and do not incorporate global structural features, since they often simply concatenate attribute features and embeddings for prediction. Therefore, this paper proposes a novel link prediction model based on Sub-graphs Learning with Fused Features, named SLFF for short. Specifically, the proposed model uses random walks to extract sub-graphs, which reduces the overhead of this step. Moreover, it applies Node2Vec to the entire graph to obtain the global structural characteristics of each node. Afterward, the SLFF model reconstructs the existing embeddings according to the neighborhoods defined by the graph structure and the node attribute space. Finally, the SLFF model combines the attribute characteristics of nodes with their structural characteristics. Extensive experiments demonstrate that the proposed SLFF performs better than state-of-the-art approaches.
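
The pipeline outlined in the abstract (random-walk sub-graph extraction around a candidate link, global structural embeddings via Node2Vec, and fusion of structural and attribute features before scoring the link) can be illustrated with a minimal sketch. This is not the authors' implementation: all function and variable names are hypothetical, and the Node2Vec step is replaced by a simple SVD-based embedding stand-in so the example stays self-contained.

```python
# Hypothetical sketch of an SLFF-style pipeline (illustrative only):
# (1) extract a sub-graph around a candidate link via random walks,
# (2) compute global structural embeddings over the whole graph
#     (Node2Vec in the paper; a plain SVD of the adjacency matrix here),
# (3) fuse structural embeddings with node attributes and score the link.
import networkx as nx
import numpy as np

def random_walk_subgraph(graph, u, v, walk_length=5, num_walks=10, rng=None):
    """Collect nodes reached by short random walks from both endpoints."""
    rng = rng or np.random.default_rng(0)
    nodes = {u, v}
    for start in (u, v):
        for _ in range(num_walks):
            cur = start
            for _ in range(walk_length):
                nbrs = list(graph.neighbors(cur))
                if not nbrs:
                    break
                cur = nbrs[rng.integers(len(nbrs))]
                nodes.add(cur)
    return graph.subgraph(nodes).copy()

def structural_embedding(graph, dim=8):
    """Stand-in for Node2Vec: low-rank factorisation of the adjacency matrix."""
    nodes = list(graph.nodes())
    adj = nx.to_numpy_array(graph, nodelist=nodes)
    u_mat, s, _ = np.linalg.svd(adj)
    emb = u_mat[:, :dim] * np.sqrt(s[:dim])
    return {n: emb[i] for i, n in enumerate(nodes)}

def fuse_and_score(u, v, struct_emb, attrs):
    """Concatenate structural and attribute features, then score the link."""
    fu = np.concatenate([struct_emb[u], attrs[u]])
    fv = np.concatenate([struct_emb[v], attrs[v]])
    return float(1.0 / (1.0 + np.exp(-fu @ fv)))  # logistic link score

if __name__ == "__main__":
    g = nx.karate_club_graph()
    # Toy node attributes: degree and local clustering coefficient.
    attrs = {n: np.array([g.degree(n), nx.clustering(g, n)]) for n in g.nodes()}
    sub = random_walk_subgraph(g, 0, 33)
    emb = structural_embedding(g)  # global structure from the entire graph
    print("sub-graph size:", sub.number_of_nodes())
    print("link score (0, 33):", fuse_and_score(0, 33, emb, attrs))
```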



Acknowledgments

This work is supported by the National Natural Science Foundation of China (61902116).

Author information

Corresponding author

Correspondence to Haoran Chen.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Chen, H. et al. (2024). Link Prediction Based on the Sub-graphs Learning with Fused Features. In: Luo, B., Cheng, L., Wu, Z.G., Li, H., Li, C. (eds.) Neural Information Processing. ICONIP 2023. Lecture Notes in Computer Science, vol 14449. Springer, Singapore. https://doi.org/10.1007/978-981-99-8067-3_19

  • DOI: https://doi.org/10.1007/978-981-99-8067-3_19

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8066-6

  • Online ISBN: 978-981-99-8067-3

  • eBook Packages: Computer Science, Computer Science (R0)
