Abstract

Graph pooling is a crucial down-sampling operation in graph neural networks (GNNs). Existing graph pooling methods suffer from two main issues. First, pooling methods based on node dropping evaluate the importance of nodes from only a single perspective, such as attention or node distance. However, the importance score of a node differs when it is evaluated from different aspects, so node importance cannot be assessed comprehensively from a single view. Second, necessary information about the graph structure is lost when nodes with lower scores are discarded; the coarsened graph may even become too sparse, which degrades performance in downstream learning tasks such as graph classification. To address these issues, we propose an attention-based multi-view parallel graph pooling method. Specifically, to evaluate node importance comprehensively, we score each node from its features, its local topology structure, and its global topology structure via the attention mechanism. Moreover, to alleviate the information loss caused by node dropping, we introduce multi-view parallel pooling, which performs graph pooling separately from three views, namely node features, local topology structure, and global topology structure, and then integrates the three resulting coarsened graphs into a single informative graph. Experimental results on four benchmark datasets demonstrate the effectiveness of our proposed method for graph classification.
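As a rough illustration of the idea described above, the following is a minimal, hypothetical PyTorch sketch of multi-view top-k pooling on a small dense graph: nodes are scored from a feature view, a local-topology view, and a global-topology view, each view is pooled independently, and the three coarsened graphs are then merged. All names (e.g. MultiViewPoolSketch, ratio) and the specific score proxies are illustrative assumptions, not the authors' implementation.

```python
# A minimal, hypothetical sketch of multi-view top-k pooling on a dense
# adjacency matrix. Class and argument names (MultiViewPoolSketch, ratio)
# and the score proxies are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn


class MultiViewPoolSketch(nn.Module):
    def __init__(self, in_dim: int, ratio: float = 0.5):
        super().__init__()
        self.ratio = ratio
        # Attention-style scorer for the node-feature view.
        self.feat_att = nn.Linear(in_dim, 1)

    def scores(self, x, adj):
        # View 1: feature-based attention score per node.
        s_feat = self.feat_att(x).squeeze(-1)
        # View 2: local topology score (node degree, a simple proxy).
        s_local = adj.sum(dim=1)
        # View 3: global topology score (2-hop reachability, a simple proxy).
        s_global = ((adj @ adj) > 0).float().sum(dim=1)
        return s_feat, s_local, s_global

    def pool_one_view(self, x, score):
        # Keep the top-k nodes of this view and gate their features by the score.
        k = max(1, int(self.ratio * x.size(0)))
        idx = torch.topk(score, k).indices
        x_k = x[idx] * torch.sigmoid(score[idx]).unsqueeze(-1)
        return x_k, idx

    def forward(self, x, adj):
        views = self.scores(x, adj)
        pooled = [self.pool_one_view(x, s) for s in views]
        # Integrate the three coarsened graphs over the union of kept nodes:
        # average the (gated) features of nodes selected by multiple views and
        # take the induced subgraph on the union as the merged adjacency.
        keep = torch.unique(torch.cat([idx for _, idx in pooled]))
        x_out = x.new_zeros((keep.size(0), x.size(1)))
        count = x.new_zeros((keep.size(0), 1))
        for x_k, idx in pooled:
            pos = torch.searchsorted(keep, idx)
            x_out[pos] += x_k
            count[pos] += 1
        x_out = x_out / count.clamp(min=1)
        adj_out = adj[keep][:, keep]
        return x_out, adj_out


# Example: a random graph with 10 nodes and 16-dimensional features.
pool = MultiViewPoolSketch(in_dim=16, ratio=0.5)
x = torch.randn(10, 16)
adj = (torch.rand(10, 10) > 0.7).float()
x_pooled, adj_pooled = pool(x, adj)
```

Degree and 2-hop reachability here are only simple stand-ins for the attention-based local and global structure scores the abstract describes; the point of the sketch is the parallel per-view pooling followed by the merging step.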

Data Availability

The experimental datasets are available at https://chrsmrrs.github.io/datasets/docs/datasets/.

Acknowledgements

This work is supported by the Natural Science Foundation of China: 61806005, the University Synergy Innovation Program of Anhui Province: GXXT-2022-052 and GXXT-2020-012, the Outstanding Young Talents Support Program of Anhui Province: gxyqZD2022032, and the Natural Science Foundation of the Educational Commission of Anhui Province of China: KJ2021A0373.

Author information

Contributions

Yuanyuan Wang constructed the model and conducted the experiments; Jun Huang wrote the main manuscript. All authors reviewed the manuscript.

Corresponding author

Correspondence to Jun Huang.

Ethics declarations

Conflict of interest

The authors declare no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Huang, J., Wang, YY. Multi-view parallel graph pooling. Int J Data Sci Anal (2023). https://doi.org/10.1007/s41060-023-00476-8
