Graph Time Series Analysis Using Transfer Entropy

  • Conference paper
  • In: Structural, Syntactic, and Statistical Pattern Recognition (S+SSPR 2018)

Abstract

In this paper, we explore how Schreiber’s transfer entropy can be used to develop a new entropic characterisation of graphs derived from time series data. We use the transfer entropy to weight the edges of a graph in which the nodes represent time series and the edges represent the degree of commonality between pairs of time series. The result is a weighted graph that captures the information transfer between nodes over specific time intervals. At each time interval we characterise the network using the von Neumann entropy computed from the spectrum of the weighted normalised Laplacian, and we study how this entropic characterisation evolves over time and can be used to capture temporal changes and anomalies in network structure. We apply the method to stock-market data, namely time series of closing stock prices on the New York Stock Exchange and NASDAQ markets, augmented with information about the industrial or commercial sector to which each stock belongs. We use our method not only to analyse overall market behaviour, but also to study inter-sector and intra-sector trends.
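
The pipeline described above lends itself to a compact illustration. The sketch below (Python with NumPy) shows one way to realise it: pairwise transfer entropies weight the directed edges of a graph over a sliding window, and each window’s graph is summarised by the von Neumann entropy of its normalised Laplacian spectrum. The quantile discretisation, the binned estimator with a single step of history, the window and step sizes, the symmetrisation of the weight matrix, and the function names are all illustrative assumptions rather than the authors’ implementation; the paper’s own transfer-entropy estimates may instead rely on a dedicated estimator such as the JIDT toolkit cited in the references, and reference [19] treats the directed case that the sketch sidesteps.

```python
# Minimal sketch of the pipeline in the abstract (illustrative, not the authors' code).
# Assumptions: quantile discretisation, a binned transfer-entropy estimator with a
# single step of history, and a symmetrised weight matrix before the normalised
# Laplacian is formed (the paper itself handles the directed case).
import numpy as np


def discretise(x, n_bins=4):
    """Map a real-valued series onto integer symbols 0..n_bins-1 by quantile binning."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
    return np.digitize(x, edges)


def transfer_entropy(source, target, n_bins=4):
    """Binned estimate (in nats) of T_{source -> target} with history length one."""
    s, t = discretise(source, n_bins), discretise(target, n_bins)
    x_next, x_past, y_past = t[1:], t[:-1], s[:-1]
    joint = np.zeros((n_bins, n_bins, n_bins))       # p(x_{t+1}, x_t, y_t)
    for a, b, c in zip(x_next, x_past, y_past):
        joint[a, b, c] += 1.0
    joint /= joint.sum()
    p_xp_yp = joint.sum(axis=0)                      # p(x_t, y_t)
    p_xn_xp = joint.sum(axis=2)                      # p(x_{t+1}, x_t)
    p_xp = joint.sum(axis=(0, 2))                    # p(x_t)
    te = 0.0
    for a in range(n_bins):
        for b in range(n_bins):
            for c in range(n_bins):
                p = joint[a, b, c]
                if p > 0.0:
                    te += p * np.log(p * p_xp[b] / (p_xn_xp[a, b] * p_xp_yp[b, c]))
    return te


def von_neumann_entropy(W):
    """Von Neumann entropy from the normalised Laplacian spectrum of a weighted graph."""
    W = 0.5 * (W + W.T)                              # symmetrise for a real spectrum
    d = W.sum(axis=1)
    d_safe = np.where(d > 0, d, 1.0)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d_safe), 0.0)
    L = np.eye(len(W)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    lam = np.linalg.eigvalsh(L) / len(W)             # trace of L is ~|V|, so this sums to ~1
    lam = lam[lam > 1e-12]
    return float(-(lam * np.log(lam)).sum())


def entropy_time_series(series, window=60, step=5):
    """One von Neumann entropy value per sliding window over a panel of time series.

    `series` is an (n_series, n_timesteps) array, e.g. one row of returns per stock.
    """
    n = series.shape[0]
    entropies = []
    for start in range(0, series.shape[1] - window + 1, step):
        W = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if i != j:                           # W[i, j] = information flow i -> j
                    W[i, j] = transfer_entropy(series[i, start:start + window],
                                               series[j, start:start + window])
        entropies.append(von_neumann_entropy(W))
    return np.array(entropies)
```

Applied to an array of closing-price returns with one row per stock, entropy_time_series would yield an entropy-versus-time trace of the kind the paper uses to detect temporal changes and anomalies in market structure; per-sector traces would follow by restricting the rows to the stocks of a single sector.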


References

  1. Razak, F.A., Jensen, H.J.: Quantifying ‘causality’ in complex systems: understanding transfer entropy. PLoS ONE 9(6), 1–14 (2014)
  2. Bai, L., Hancock, E.R., Ren, P.: Jensen-Shannon graph kernel using information functionals. In: Proceedings of the International Conference on Pattern Recognition (ICPR), pp. 2877–2880 (2012)
  3. Bai, L., Zhang, Z., Wang, C., Bai, X., Hancock, E.R.: A graph kernel based on the Jensen-Shannon representation alignment. In: International Joint Conference on Artificial Intelligence (IJCAI), pp. 3322–3328 (2015)
  4. Barnett, L., Barrett, A.B., Seth, A.K.: Granger causality and transfer entropy are equivalent for Gaussian variables. Phys. Rev. Lett. 103(23), 238701 (2009)
  5. Cover, T.M., Thomas, J.A.: Entropy, relative entropy, and mutual information. In: Elements of Information Theory, pp. 13–55. Wiley (2005)
  6. Frenzel, S., Pompe, B.: Partial mutual information for coupling analysis of multivariate time series. Phys. Rev. Lett. 99(20), 1–4 (2007)
  7. Granger, C.W.J.: Investigating causal relations by econometric models and cross-spectral methods. Econometrica 37(3), 424–438 (1969)
  8. Han, L., Escolano, F., Hancock, E.R., Wilson, R.C.: Graph characterizations from von Neumann entropy. Pattern Recognit. Lett. 33(15), 1958–1967 (2012)
  9. Hlaváčková-Schindler, K., Paluš, M., Vejmelka, M., Bhattacharya, J.: Causality detection based on information-theoretic approaches in time series analysis. Phys. Rep. 441(1), 1–46 (2007)
  10. Kraskov, A., Stögbauer, H., Grassberger, P.: Estimating mutual information. Phys. Rev. E 69(6), 066138 (2004)
  11. Kullback, S., Leibler, R.A.: On information and sufficiency. Ann. Math. Stat. 22(1), 79–86 (1951)
  12. Kwon, O., Yang, J.-S.: Information flow between stock indices. EPL (Europhys. Lett.) 82(6), 68003 (2008)
  13. Lizier, J.T.: JIDT: an information-theoretic toolkit for studying the dynamics of complex systems. Front. Robot. AI 1, 11 (2014)
  14. Passerini, F., Severini, S.: The von Neumann entropy of networks. In: Developments in Intelligent Agent Technologies and Multi-Agent Systems, pp. 66–76 (2008)
  15. Schreiber, T.: Measuring information transfer. Phys. Rev. Lett. 85(2), 461–464 (2000)
  16. Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27(3), 379–423 (1948)
  17. Smith, S.M.: Overview of fMRI analysis. In: Functional Magnetic Resonance Imaging, pp. 216–230. Oxford University Press (2001)
  18. Ye, C., et al.: Thermodynamic characterization of networks using graph polynomials. Phys. Rev. E 92(3), 032810 (2015)
  19. Ye, C., Wilson, R.C., Comin, C.H., Costa, L.D.F., Hancock, E.R.: Approximate von Neumann entropy for directed graphs. Phys. Rev. E 89(5), 052804 (2014)
  20. Ye, C., Wilson, R.C., Hancock, E.R.: Graph characterization from entropy component analysis. In: Proceedings of the International Conference on Pattern Recognition (ICPR), pp. 3845–3850. IEEE (2014)
  21. Ye, C., Wilson, R.C., Hancock, E.R.: A Jensen-Shannon divergence kernel for directed graphs. In: Robles-Kelly, A., Loog, M., Biggio, B., Escolano, F., Wilson, R. (eds.) S+SSPR 2016. LNCS, vol. 10029, pp. 196–206. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-49055-7_18

Author information

Corresponding author

Correspondence to Ibrahim Caglar.


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Caglar, I., Hancock, E.R. (2018). Graph Time Series Analysis Using Transfer Entropy. In: Bai, X., Hancock, E., Ho, T., Wilson, R., Biggio, B., Robles-Kelly, A. (eds.) Structural, Syntactic, and Statistical Pattern Recognition. S+SSPR 2018. Lecture Notes in Computer Science, vol. 11004. Springer, Cham. https://doi.org/10.1007/978-3-319-97785-0_21

  • DOI: https://doi.org/10.1007/978-3-319-97785-0_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-97784-3

  • Online ISBN: 978-3-319-97785-0

  • eBook Packages: Computer Science (R0)
