DeepGG: A Deep Graph Generator

  • Conference paper
  • Published in: Advances in Intelligent Data Analysis XIX (IDA 2021)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12695)

Abstract

Learning distributions of graphs can be used for automatic drug discovery, molecular design, complex network analysis, and much more. We present an improved framework for learning generative models of graphs based on the idea of deep state machines. To learn state transition decisions, we use a set of graph and node embedding techniques as the memory of the state machine.
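To make the state-machine view concrete, the following sketch (not the authors' implementation; module names, dimensions, and the GRU-based memory are illustrative assumptions) shows a sequential generator in the spirit of deep generative models of graphs: the machine loops over add-node, add-edge, and pick-destination decisions, each conditioned on a learned graph embedding that acts as its memory.

import torch
import torch.nn as nn
import networkx as nx

class GraphStateMachine(nn.Module):
    # Illustrative modules only; DeepGG's concrete architecture may differ.
    def __init__(self, node_dim=16, graph_dim=32):
        super().__init__()
        self.init_node = nn.Parameter(torch.zeros(node_dim))
        self.node_update = nn.GRUCell(graph_dim, node_dim)   # refresh a node's memory
        self.readout = nn.Linear(node_dim, graph_dim)        # node memories -> graph memory
        self.decide_add_node = nn.Linear(graph_dim, 2)       # halt vs. add another node
        self.decide_add_edge = nn.Linear(graph_dim + node_dim, 2)
        self.score_destination = nn.Linear(2 * node_dim, 1)  # choose the edge endpoint

    def graph_memory(self, states):
        if not states:
            return torch.zeros(self.readout.out_features)
        return self.readout(torch.stack(states)).sum(dim=0)

    @staticmethod
    def decide(logits):
        # Sample decisions instead of taking the argmax so generation stays stochastic.
        return torch.distributions.Categorical(logits=logits).sample().item()

    @torch.no_grad()
    def sample(self, max_nodes=150):
        graph, states = nx.Graph(), []
        while len(states) < max_nodes:
            h_g = self.graph_memory(states)
            if states and self.decide(self.decide_add_node(h_g)) == 0:
                break                                          # the state machine halts
            v = len(states)
            graph.add_node(v)
            states.append(self.node_update(h_g.unsqueeze(0),
                                           self.init_node.unsqueeze(0)).squeeze(0))
            while v > 0:                                       # connect v to earlier nodes
                h_g = self.graph_memory(states)
                if self.decide(self.decide_add_edge(torch.cat([h_g, states[v]]))) == 0:
                    break
                scores = torch.stack([self.score_destination(torch.cat([states[v], states[u]]))
                                      for u in range(v)]).squeeze(-1)
                u = torch.distributions.Categorical(logits=scores).sample().item()
                graph.add_edge(v, u)
                # Update both endpoints so the next decision reflects the new edge.
                states[v] = self.node_update(h_g.unsqueeze(0), states[v].unsqueeze(0)).squeeze(0)
                states[u] = self.node_update(h_g.unsqueeze(0), states[u].unsqueeze(0)).squeeze(0)
        return graph

g = GraphStateMachine().sample(max_nodes=20)
print(g.number_of_nodes(), "nodes,", g.number_of_edges(), "edges")

Because each decision is sampled rather than taken greedily, a trained machine of this kind induces a distribution over graphs rather than a single output.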

Our analysis is based on learning the distributions of random graph generators, for which we provide statistical tests to determine which properties can be learned and how well the original distribution of graphs is represented. We show that the design of the state machine favors specific distributions. We learn models of graphs with up to 150 vertices. Code and parameters are publicly available to reproduce our results.
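As a rough illustration of such a comparison (assuming networkx and scipy; the statistics and the test shown here are stand-ins, not the paper's exact evaluation protocol), one can draw graphs from a reference random-graph generator and from a candidate model, compute per-graph statistics, and compare the two samples with a two-sample Kolmogorov-Smirnov test:

import networkx as nx
from scipy.stats import ks_2samp

def graph_stats(g):
    # Summary statistics commonly used to characterise graph distributions.
    return {
        "clustering": nx.average_clustering(g),
        "degree_mean": sum(d for _, d in g.degree()) / g.number_of_nodes(),
    }

def sample_stats(generator, n_samples=100):
    samples = [graph_stats(generator()) for _ in range(n_samples)]
    return {key: [s[key] for s in samples] for key in samples[0]}

reference = lambda: nx.watts_strogatz_graph(n=100, k=4, p=0.1)
candidate = lambda: nx.watts_strogatz_graph(n=100, k=4, p=0.3)  # stand-in for a learned model

ref, cand = sample_stats(reference), sample_stats(candidate)
for key in ref:
    stat, p_value = ks_2samp(ref[key], cand[key])
    print(f"{key}: KS statistic={stat:.3f}, p={p_value:.3g}")

A small p-value for a property indicates that the candidate did not reproduce that property's distribution under the reference generator.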


Notes

  1. Our source code is published at https://github.com/innvariant/deepgg.

  2. As with all representations of graphs, there is no simple canonical representation of a graph; otherwise, the graph isomorphism problem could be solved in polynomial time.


Acknowledgements

We thank Jörg Schlötterer for valuable discussions during this research, and everyone who contributed feedback, including our reviewers.

Author information


Corresponding author

Correspondence to Julian Stier.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Stier, J., Granitzer, M. (2021). DeepGG: A Deep Graph Generator. In: Abreu, P.H., Rodrigues, P.P., Fernández, A., Gama, J. (eds) Advances in Intelligent Data Analysis XIX. IDA 2021. Lecture Notes in Computer Science, vol. 12695. Springer, Cham. https://doi.org/10.1007/978-3-030-74251-5_25


  • DOI: https://doi.org/10.1007/978-3-030-74251-5_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-74250-8

  • Online ISBN: 978-3-030-74251-5

  • eBook Packages: Computer Science, Computer Science (R0)
