Graph Neural Networks


Part of the Mathematics in Industry book series (MATHINDUSTRY, volume 37)

Abstract

Many important real-world data sets are available in the form of graphs or networks: social networks, the world-wide web (WWW), protein-interaction networks, brain networks, molecular networks, etc. See the examples in Fig. 8.1. In fact, the complex interactions in real systems can be described by different forms of graphs, so graphs serve as a ubiquitous tool for representing complex systems.

Fig. 8.1 Examples of graphs in real life
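To make the abstract's point concrete, the minimal sketch below (not taken from the chapter; the toy network, the node features, and the single propagation step are illustrative assumptions) encodes a small four-node network as an adjacency matrix and applies one symmetrically normalized neighborhood-aggregation step of the kind used by graph convolutional networks.

```python
import numpy as np

# A toy "social network" on 4 nodes, encoded as an adjacency matrix.
# Undirected edges: 0-1, 0-2, 1-2, 2-3.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Node features: two arbitrary attributes per node (assumed for illustration).
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.5, 0.5]])

# One graph-convolution-style propagation step:
# add self-loops, symmetrically normalize, then aggregate neighbor features.
A_hat = A + np.eye(A.shape[0])                     # adjacency with self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(1)))  # D^{-1/2}
H = D_inv_sqrt @ A_hat @ D_inv_sqrt @ X            # smoothed node features

print(H)
```

Each row of H is a weighted average of a node's own features and those of its neighbors; stacking such aggregation steps with learned weight matrices and nonlinearities is the basic building block of graph neural networks.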




Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this chapter


Cite this chapter

Ye, J.C. (2022). Graph Neural Networks. In: Geometry of Deep Learning. Mathematics in Industry, vol 37. Springer, Singapore. https://doi.org/10.1007/978-981-16-6046-7_8
