Abstract
Learning distributions over graphs has applications in automatic drug discovery, molecular design, complex-network analysis, and more. We present an improved framework for learning generative models of graphs based on deep state machines, in which state-transition decisions are learned using a set of graph and node embedding techniques that serve as the state machine's memory.
Our analysis is based on learning the distributions induced by random graph generators, for which we provide statistical tests to determine which properties can be learned and how faithfully the original distribution of graphs is represented. We show that the design of the state machine favors specific distributions. We learn models of graphs with up to 150 vertices. Code and parameters are publicly available to reproduce our results.
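The construction loop of such a state-machine generator can be sketched as follows. This is a minimal stand-in, not the paper's model: the learned, embedding-based transition decisions are replaced here by random choices (`rng.random()` thresholds), and the function name `generate_graph` is an assumption for illustration only.

```python
import random

def generate_graph(max_nodes=10, seed=0):
    """Sequential graph construction in the style of a state machine:
    repeat (add_node -> add_edge decisions) until 'stop' or max_nodes.
    A learned policy over node/graph embeddings would make each decision;
    random thresholds stand in for it here."""
    rng = random.Random(seed)
    adj = {0: set()}                  # start with a single node, no edges
    while len(adj) < max_nodes:
        # state 'add_node?': stand-in for a learned transition decision
        if rng.random() >= 0.8:
            break                     # transition to the 'stop' state
        new = len(adj)
        adj[new] = set()
        # state 'add_edge?': one decision per candidate endpoint
        for v in range(new):
            if rng.random() < 0.5:    # stand-in for an embedding-based edge score
                adj[new].add(v)
                adj[v].add(new)       # keep the adjacency symmetric (undirected)
    return adj

g = generate_graph()
```

In the learned version, each threshold comparison is replaced by a scoring network that reads the current node and graph embeddings, which is what makes the embeddings act as the state machine's memory.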
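One way such a statistical test can work is to compare a graph property, e.g. the degree sequence, between samples from two generators via a two-sample Kolmogorov–Smirnov statistic. The sketch below is an illustrative assumption, not the paper's exact test battery; the KS statistic is computed by hand to stay dependency-free, and `er_degrees` samples an Erdős–Rényi graph.

```python
import random

def ks_statistic(xs, ys):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the two empirical CDFs."""
    def ecdf(sample, t):
        return sum(1 for v in sample if v <= t) / len(sample)
    grid = sorted(set(xs) | set(ys))
    return max(abs(ecdf(xs, t) - ecdf(ys, t)) for t in grid)

def er_degrees(n, p, rng):
    """Degree sequence of one Erdos-Renyi G(n, p) sample."""
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                deg[i] += 1
                deg[j] += 1
    return deg

rng = random.Random(0)
a = er_degrees(60, 0.1, rng)
b = er_degrees(60, 0.1, rng)   # same generator as a
c = er_degrees(60, 0.5, rng)   # much denser generator
# samples from the same generator should yield a smaller statistic
```

A learned model whose generated graphs keep the KS statistic against the training generator small, across several properties, can be said to represent that distribution well.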
Notes
- 1. Our source code is published at https://github.com/innvariant/deepgg.
- 2. As with all representations of graphs, there is no easy canonical representation; otherwise the graph isomorphism problem could be solved in polynomial time.
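This footnote can be made concrete: the same graph under two vertex labelings yields different adjacency matrices, and deciding their equivalence requires an isomorphism test. The brute-force permutation check below is only feasible for tiny graphs, which is exactly the point.

```python
from itertools import permutations

def adj_matrix(n, edges):
    """Dense adjacency matrix of an undirected graph on n vertices."""
    m = [[0] * n for _ in range(n)]
    for u, v in edges:
        m[u][v] = m[v][u] = 1
    return m

def isomorphic(a, b):
    """Brute-force isomorphism check: try every vertex relabeling.
    O(n!) time, so only usable for very small graphs."""
    n = len(a)
    return any(
        all(a[i][j] == b[p[i]][p[j]] for i in range(n) for j in range(n))
        for p in permutations(range(n))
    )

g1 = adj_matrix(4, [(0, 1), (1, 2), (2, 3)])  # path, labeled in order
g2 = adj_matrix(4, [(2, 0), (0, 3), (3, 1)])  # same path, relabeled
```

Here `g1` and `g2` are unequal as matrices yet isomorphic as graphs, which is why sequential graph generators must cope with many equivalent construction orders for the same graph.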
Acknowledgements
We thank Jörg Schlötterer for valuable discussions during this research, and everyone who contributed feedback, including our reviewers.
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Stier, J., Granitzer, M. (2021). DeepGG: A Deep Graph Generator. In: Abreu, P.H., Rodrigues, P.P., Fernández, A., Gama, J. (eds) Advances in Intelligent Data Analysis XIX. IDA 2021. Lecture Notes in Computer Science(), vol 12695. Springer, Cham. https://doi.org/10.1007/978-3-030-74251-5_25
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-74250-8
Online ISBN: 978-3-030-74251-5