Abstract
In this paper, we explore how Schreiber's transfer entropy can be used to develop a new entropic characterisation of graphs derived from time series data. We use transfer entropy to weight the edges of a graph in which the nodes represent time series and the edge weights capture the degree of commonality between pairs of time series. The result is a weighted graph that captures the information transfer between nodes over specific time intervals. From the weighted normalised Laplacian we characterise the network at each time interval using the von Neumann entropy computed from the normalised Laplacian spectrum, and we study how this entropic characterisation evolves with time and how it can be used to capture temporal changes and anomalies in network structure. We apply the method to stock-market data, namely time series of closing stock prices on the New York Stock Exchange and NASDAQ markets. These data are augmented with information concerning the industrial or commercial sector to which each stock belongs. We use our method not only to analyse overall market behaviour, but also inter-sector and intra-sector trends.
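To make the entropic characterisation concrete, the following is a minimal sketch (not the authors' code) of the von Neumann entropy of a weighted graph, in the sense of Passerini and Severini: the normalised Laplacian scaled by 1/n is treated as a density matrix, and the entropy is computed from its eigenvalues. The function name and the use of numpy are illustrative assumptions; in the paper the weight matrix would hold pairwise transfer-entropy values for one time window.

```python
import numpy as np

def von_neumann_entropy(W):
    """Von Neumann entropy of a weighted, undirected graph.

    W: symmetric non-negative weight matrix (here assumed to hold
    symmetrised pairwise transfer-entropy weights for one window).
    The density matrix is rho = L_tilde / n, where L_tilde is the
    normalised Laplacian; the entropy is -sum_i p_i log2 p_i over
    the eigenvalues p_i of rho.
    """
    n = W.shape[0]
    d = W.sum(axis=1)                  # weighted degrees
    d_inv_sqrt = np.zeros_like(d)
    nz = d > 0
    d_inv_sqrt[nz] = 1.0 / np.sqrt(d[nz])
    # Normalised Laplacian: L_tilde = I - D^{-1/2} W D^{-1/2}
    L = np.eye(n) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    p = np.linalg.eigvalsh(L) / n      # eigenvalues of rho = L_tilde / n
    p = p[p > 1e-12]                   # drop zeros: 0 log 0 = 0 by convention
    return float(-np.sum(p * np.log2(p)))
```

In a time-series setting one would recompute W for each time interval and track the resulting entropy values as a one-dimensional signal, in which structural changes or anomalies appear as abrupt shifts.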
References
Razak, F.A., Jensen, H.J.: Quantifying ‘causality’ in complex systems: understanding transfer entropy. PLoS ONE 9(6), 1–14 (2014)
Bai, L., Hancock, E.R., Ren, P.: Jensen-Shannon graph kernel using information functionals. In: Proceedings of the International Conference on Pattern Recognition, ICPR, pp. 2877–2880 (2012)
Bai, L., Zhang, Z., Wang, C., Bai, X., Hancock, E.R.: A graph kernel based on the Jensen-Shannon representation alignment. In: International Joint Conference on Artificial Intelligence, IJCAI, January 2015, pp. 3322–3328 (2015)
Barnett, L., Barrett, A.B., Seth, A.K.: Granger causality and transfer entropy are equivalent for Gaussian variables. Phys. Rev. Lett. 103(23), 238701 (2009)
Cover, T.M., Thomas, J.A.: Entropy, relative entropy, and mutual information. In: Elements of Information Theory, pp. 13–55. Wiley (2005)
Frenzel, S., Pompe, B.: Partial mutual information for coupling analysis of multivariate time series. Phys. Rev. Lett. 99(20), 1–4 (2007)
Granger, C.W.J.: Investigating causal relations by econometric models and cross-spectral methods. Econometrica 37(3), 424 (1969)
Han, L., Escolano, F., Hancock, E.R., Wilson, R.C.: Graph characterizations from von Neumann entropy. Pattern Recognit. Lett. 33(15), 1958–1967 (2012)
Hlaváčková-Schindler, K., Paluš, M., Vejmelka, M., Bhattacharya, J.: Causality detection based on information-theoretic approaches in time series analysis. Phys. Rep. 441(1), 1–46 (2007)
Kraskov, A., Stögbauer, H., Grassberger, P.: Estimating mutual information. Phys. Rev. E - Stat. Nonlinear Soft Matter Phys. 69(6), 066138 (2004)
Kullback, S., Leibler, R.A.: On information and sufficiency. Ann. Math. Stat. 22(1), 79–86 (1951)
Kwon, O., Yang, J.-S.: Information flow between stock indices. EPL (Europhys. Lett.) 82(6), 68003 (2008)
Lizier, J.T.: JIDT: an information-theoretic toolkit for studying the dynamics of complex systems. Front. Robot. AI 1, 11 (2014)
Passerini, F., Severini, S.: The von Neumann entropy of networks. In: Developments in Intelligent Agent Technologies and Multi-Agent Systems, pp. 66–76, December 2008
Schreiber, T.: Measuring information transfer. Phys. Rev. Lett. 85(2), 461–464 (2000)
Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27(3), 379–423 (1948)
Smith, S.M.: Overview of fMRI analysis. In: Functional Magnetic Resonance Imaging, pp. 216–230. Oxford University Press, November 2001
Ye, C., et al.: Thermodynamic characterization of networks using graph polynomials. Phys. Rev. E 92(3), 032810 (2015)
Ye, C., Wilson, R.C., Comin, C.H., Costa, L.D.F., Hancock, E.R.: Approximate von Neumann entropy for directed graphs. Phys. Rev. E - Stat. Nonlinear Soft Matter Phys. 89(5), 052804 (2014)
Ye, C., Wilson, R.C., Hancock, E.R.: Graph characterization from entropy component analysis. In: Proceedings of the International Conference on Pattern Recognition, pp. 3845–3850. IEEE, August 2014
Ye, C., Wilson, R.C., Hancock, E.R.: A Jensen-Shannon divergence kernel for directed graphs. In: Robles-Kelly, A., Loog, M., Biggio, B., Escolano, F., Wilson, R. (eds.) S+SSPR 2016. LNCS, vol. 10029, pp. 196–206. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-49055-7_18
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
Caglar, I., Hancock, E.R. (2018). Graph Time Series Analysis Using Transfer Entropy. In: Bai, X., Hancock, E., Ho, T., Wilson, R., Biggio, B., Robles-Kelly, A. (eds) Structural, Syntactic, and Statistical Pattern Recognition. S+SSPR 2018. Lecture Notes in Computer Science(), vol 11004. Springer, Cham. https://doi.org/10.1007/978-3-319-97785-0_21
DOI: https://doi.org/10.1007/978-3-319-97785-0_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-97784-3
Online ISBN: 978-3-319-97785-0