
DEGNN: Dual Experts Graph Neural Network Handling both Edge and Node Feature Noise

  • Conference paper
Advances in Knowledge Discovery and Data Mining (PAKDD 2024)

Abstract

Graph Neural Networks (GNNs) have achieved notable success in various applications over graph data. However, recent research has revealed that real-world graphs often contain noise, and GNNs are susceptible to it. To address this issue, several Graph Structure Learning (GSL) models have been introduced. While GSL models are tailored to enhance robustness against edge noise through edge reconstruction, a significant limitation surfaces: their heavy reliance on node features. This inherent dependence amplifies their susceptibility to noise within node features. Recognizing this vulnerability, we present DEGNN, a novel GNN model designed to mitigate noise in both edges and node features. The core idea of DEGNN is to design two separate experts: an edge expert and a node feature expert. These experts utilize self-supervised learning techniques to produce modified edges and node features. Leveraging these modified representations, DEGNN subsequently addresses downstream tasks, ensuring robustness against noise present in both the edges and node features of real-world graphs. Notably, the modification process can be trained end-to-end, empowering DEGNN to adjust dynamically and achieve optimal edge and node representations for specific tasks. Comprehensive experiments demonstrate DEGNN’s efficacy in managing noise, both in original real-world graphs and in graphs with synthetic noise.
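The dual-expert idea described in the abstract can be illustrated with a toy sketch. Note this is not the paper's actual method: DEGNN's experts are learned with self-supervised objectives and trained end-to-end, whereas the functions below (`edge_expert`, `node_feature_expert`, and the similarity threshold) are hypothetical, hand-crafted stand-ins that merely show the division of labor — one expert modifies edges, the other modifies node features:

```python
import numpy as np

def edge_expert(X, A, threshold=0.5):
    """Toy edge expert: rescore edges by cosine similarity of node
    features and drop low-similarity (likely noisy) edges."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    S = (X @ X.T) / (norms @ norms.T + 1e-8)   # pairwise cosine similarity
    return A * (S > threshold)                 # keep only confident edges

def node_feature_expert(X, A, alpha=0.5):
    """Toy node feature expert: denoise features by mixing each node's
    features with the mean of its neighbors' features."""
    deg = A.sum(axis=1, keepdims=True)
    agg = (A @ X) / np.maximum(deg, 1)         # neighborhood mean
    return alpha * X + (1 - alpha) * agg

# Toy graph: 4 nodes in two feature clusters, with one noisy edge (0-2)
# connecting dissimilar nodes.
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

A_mod = edge_expert(X, A)            # noisy edge 0-2 is removed
X_mod = node_feature_expert(X, A_mod)
```

In this sketch the edge expert prunes the cross-cluster edge (cosine similarity 0), and the feature expert then smooths each node's features over the cleaned neighborhood; DEGNN instead learns both modifications jointly for the downstream task.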


Notes

  1. Code is available at: https://github.com/TaiHasegawa/DEGNN.


Acknowledgements

This work is partly supported by JSPS Grant-in-Aid for Scientific Research (grant numbers 23H03451 and 21K12042) and the New Energy and Industrial Technology Development Organization (grant number JPNP20017).

Author information

Correspondence to Xin Liu.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Hasegawa, T., Yun, S., Liu, X., Phua, Y.J., Murata, T. (2024). DEGNN: Dual Experts Graph Neural Network Handling both Edge and Node Feature Noise. In: Yang, DN., Xie, X., Tseng, V.S., Pei, J., Huang, JW., Lin, J.CW. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2024. Lecture Notes in Computer Science, vol. 14646. Springer, Singapore. https://doi.org/10.1007/978-981-97-2253-2_30


  • DOI: https://doi.org/10.1007/978-981-97-2253-2_30


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-97-2252-5

  • Online ISBN: 978-981-97-2253-2

  • eBook Packages: Computer Science (R0)
