DynGCN: A Dynamic Graph Convolutional Network Based on Spatial-Temporal Modeling

  • Conference paper
  • In: Web Information Systems Engineering – WISE 2020 (WISE 2020)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12342)

Abstract

Representation learning on graphs has recently attracted considerable interest, with graph convolutional networks (GCNs) achieving state-of-the-art performance in many graph mining tasks. However, most existing methods focus on static graphs and ignore the fact that real-world graphs may be dynamic in nature. Although a few recent studies go a step further and combine sequence modeling (e.g., RNNs) with the GCN framework, they fail to capture the dynamism of graph structural (i.e., spatial) information over time. In this paper, we propose a Dynamic Graph Convolutional Network (DynGCN) that performs spatial and temporal convolutions in an interleaving manner, along with a model-adapting mechanism that updates model parameters to adapt to new graph snapshots. The model is able to extract both structural and temporal dynamism from dynamic graphs. We conduct extensive experiments on several real-world datasets for link prediction and edge classification tasks. Results show that DynGCN outperforms state-of-the-art methods.
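
To make the interleaving idea concrete, the sketch below shows one way a spatial graph convolution over each snapshot could be alternated with a temporal convolution across snapshots. This is a minimal illustration under stated assumptions, not the authors' implementation: the class names (DynGCNBlock, SpatialGCNLayer), the dimensions, and the use of a 1-D convolution for the temporal step are hypothetical, and the model-adapting mechanism from the paper is omitted.

```python
# Minimal, hypothetical sketch of interleaved spatial-temporal convolution
# over a sequence of graph snapshots (not the authors' DynGCN code).
import torch
import torch.nn as nn


class SpatialGCNLayer(nn.Module):
    """One graph-convolution step, X' = relu(A_hat @ X @ W), per snapshot."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, x):
        # a_hat: (N, N) normalized adjacency of one snapshot; x: (N, in_dim)
        return torch.relu(self.linear(a_hat @ x))


class DynGCNBlock(nn.Module):
    """Hypothetical block: spatial convolution on each snapshot, then a 1-D
    temporal convolution over the resulting sequence of node embeddings."""

    def __init__(self, in_dim, hid_dim, kernel_size=3):
        super().__init__()
        self.spatial = SpatialGCNLayer(in_dim, hid_dim)
        # Temporal convolution mixes each node's embeddings across time steps.
        self.temporal = nn.Conv1d(hid_dim, hid_dim, kernel_size,
                                  padding=kernel_size // 2)

    def forward(self, adjs, feats):
        # adjs: list of T normalized adjacency matrices, each (N, N)
        # feats: list of T node-feature matrices, each (N, in_dim)
        spatial_out = [self.spatial(a, x) for a, x in zip(adjs, feats)]
        h = torch.stack(spatial_out, dim=0)   # (T, N, hid_dim)
        h = h.permute(1, 2, 0)                # (N, hid_dim, T) for Conv1d
        h = torch.relu(self.temporal(h))
        return h.permute(2, 0, 1)             # back to (T, N, hid_dim)


if __name__ == "__main__":
    T, N, F = 4, 10, 8                         # toy sizes
    adjs = [torch.eye(N) for _ in range(T)]    # placeholder adjacencies
    feats = [torch.randn(N, F) for _ in range(T)]
    block = DynGCNBlock(F, 16)
    print(block(adjs, feats).shape)            # torch.Size([4, 10, 16])
```

Stacking such blocks alternates spatial and temporal filtering, which is the interleaving pattern the abstract describes; the paper's adaptation of model parameters to new snapshots would sit on top of this and is not shown here.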



Notes

  1. http://snap.stanford.edu/data/soc-sign-bitcoin-otc.html.
  2. http://snap.stanford.edu/data/soc-sign-bitcoin-alpha.html.
  3. http://snap.stanford.edu/data/as-733.html.


Acknowledgements

This work was supported by NSFC under grants 61932001 and 61961130390, and by the Beijing Academy of Artificial Intelligence (BAAI).

Author information

Corresponding author: Lei Zou.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Li, J., Liu, Y., Zou, L. (2020). DynGCN: A Dynamic Graph Convolutional Network Based on Spatial-Temporal Modeling. In: Huang, Z., Beek, W., Wang, H., Zhou, R., Zhang, Y. (eds) Web Information Systems Engineering – WISE 2020. WISE 2020. Lecture Notes in Computer Science, vol 12342. Springer, Cham. https://doi.org/10.1007/978-3-030-62005-9_7

  • DOI: https://doi.org/10.1007/978-3-030-62005-9_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-62004-2

  • Online ISBN: 978-3-030-62005-9

  • eBook Packages: Computer Science, Computer Science (R0)
