
1 Introduction

Complex systems can be regarded as collections of inhomogeneously and generically interacting units, and they are ubiquitous in both nature and man-made systems. They appear in a wide range of scenarios, from biological and ecological to social and technological fields, and can refer to phenomena at any scale, from the molecular level to large communication infrastructures [4]. Notable examples are the World Wide Web, metabolic reaction networks, financial market stock correlations, scientific collaboration, coauthorship and citation relations, and social interactions [9].

The study of the dynamics of these structures has therefore become increasingly important and a central subject of interdisciplinary research. In particular, network evolution mechanisms play a central role. This follows a crucial change of viewpoint in network analysis, from a rather unnatural static view of networks to a more realistic characterization of the dynamics of the system, aimed at predicting their behavior and explaining processes acting over the networks. Indeed, the majority of current efforts aims at identifying relations between system structure and network performance, e.g., how the network evolves with time to respond to structural needs, whereas previous efforts were aimed at representing the problem, i.e., characterizing the network structure [3].

An excellent framework for the study of complex networks relies on statistical physics and thermodynamics, connecting the macroscopic properties of a system to the behavior of its microscopic particles [68]. In particular, thermodynamics defines the macroscopic properties of a system through three variables, subject to the constraints imposed by the four laws of thermodynamics. For instance, in the case of the graph representation of complex networks, Escolano et al. [2] provide a thermodynamic characterization based on the variation of local histories over graphs.

In this paper we present a novel method to characterize the behaviour of evolving systems based on a thermodynamic framework for graphs. Specifically, the graph Laplacian of each time slice is seen as a quantum mixed state undergoing free evolution - through the Schrödinger equation - under an unknown time-dependent Hamiltonian representing the change in potential due to external factors, while entropy and energy change through direct interaction with the environment.

In this way, the evolution of the network allows us to estimate the hidden time-varying Hamiltonian, and consequently the energy exchange at each time interval as well as the variation in entropy of the underlying structure. From these we derive all the thermodynamic variables of networks, including the free energy and temperature.

The consequent characterization is utilized to represent two real-world time-varying networks: the price correlation of selected stocks in the New York Stock Exchange (NYSE) [12], and the gene expression of the life cycle of the Fruit Fly (Drosophila melanogaster) [1, 13].

2 Quantum Thermodynamics of the Network

Let G(V, E) be an undirected graph with node set V and edge set \(E\subseteq V \times V\), and let \(A = (a_{ij})\) be its adjacency matrix, where

$$ a_{ij} = {\left\{ \begin{array}{ll} 1, &{} {v_i\sim v_j,} \\ 0, &{} \text {otherwise.} \end{array}\right. } $$

The degree d of a node is the number of edges incident to it, and it can be represented through the degree matrix \(D = (d_{ij})\), a diagonal matrix with \(d_{ii} = \sum _j a_{ij}\). The graph Laplacian is then defined as \(L=D-A\), and it can be interpreted as a combinatorial analogue of the discrete Laplace-Beltrami operator. The normalized Laplacian matrix \(\tilde{L} \) is defined as

$$\begin{aligned} \tilde{L} = D^{- 1/2}\big (D - A\big )D^{-1/2} \end{aligned}$$
(1)

Dividing the normalized Laplacian by the number of vertices in the graph yields a unit-trace positive semidefinite matrix that Passerini and Severini [11] suggest can be seen as the density matrix of a quantum system, representing a quantum superposition of the transition steps of a quantum walk over the graph.
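As a concrete illustration, the construction above can be sketched in a few lines of NumPy; the function name `density_matrix` is ours, not from the paper, and the sketch assumes a graph with no isolated vertices (so that \(D^{-1/2}\) exists).

```python
import numpy as np

def density_matrix(A):
    """Unit-trace density matrix rho = L~/|V|, where
    L~ = D^{-1/2} (D - A) D^{-1/2} is the normalized Laplacian (Eq. 1).
    Assumes no isolated vertices, so all degrees are positive."""
    n = len(A)
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    # D^{-1/2} (D - A) D^{-1/2} simplifies to I - D^{-1/2} A D^{-1/2}
    L_norm = np.eye(n) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    return L_norm / n

# toy example: a 4-node cycle graph
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
rho = density_matrix(A)
```

The resulting matrix is symmetric, positive semidefinite and has unit trace, so it satisfies the defining properties of a density operator.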

The continuous-time quantum walk is the quantum counterpart of the continuous-time random walk, and it is similarly defined as a dynamical process over the vertices of the graph [5]. Here the classical state vector is replaced by a vector of complex amplitudes over V, and a general state of the walk is a complex linear combination of the basis states \(\left| v\right\rangle , v\in V\), such that the state of the walk at time t is defined as

$$\begin{aligned} \left| \psi _t\right\rangle = \sum _{u\in V} \alpha _u (t) \left| u\right\rangle \end{aligned}$$
(2)

where the amplitudes \(\alpha _u (t) \in \mathbb {C}\), so that \(\left| \psi _t\right\rangle \in \mathbb {C}^{|V|}\). Moreover, \(\alpha _u (t) \alpha _u^* (t)\) gives the probability that at time t the walker is at vertex u, and thus \(\sum _{u \in V} \alpha _u (t) \alpha ^{*}_u(t) = 1\) and \(\alpha _u (t) \alpha ^{*}_u(t) \in [0,1]\), for all \(u \in V\), \(t \in \mathbb {R}^{+}\).

The evolution of the walk is then given by the Schrödinger equation, where we denote the time-independent Hamiltonian as \(\mathcal {H}\).

$$\begin{aligned} \dfrac{\partial }{\partial t} \left| \psi _t\right\rangle = -i\mathcal {H}\left| \psi _t\right\rangle . \end{aligned}$$
(3)

Given an initial state \(\left| \psi _0\right\rangle \), we can solve Eq. (3) to determine the state vector at time t

$$\begin{aligned} \left| \psi _t\right\rangle = e^{-i\mathcal {H}t}\left| \psi _0\right\rangle . \end{aligned}$$
(4)
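Because \(\mathcal{H}\) is Hermitian, Eq. (4) can be evaluated through its eigendecomposition without a general matrix exponential. A minimal NumPy sketch, in which the choice of the path-graph Laplacian as Hamiltonian and the time value are purely illustrative:

```python
import numpy as np

# illustrative Hamiltonian: Laplacian of a 3-node path (any Hermitian matrix works)
H = np.array([[ 1, -1,  0],
              [-1,  2, -1],
              [ 0, -1,  1]], dtype=complex)

# Eq. (4): |psi_t> = exp(-iHt)|psi_0>, via H = V diag(w) V^dagger
w, V = np.linalg.eigh(H)
t = 0.7
U_t = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

psi0 = np.array([1.0, 0.0, 0.0], dtype=complex)  # walker starts at vertex 0
psi_t = U_t @ psi0
probs = np.abs(psi_t) ** 2                       # |alpha_u(t)|^2 for each vertex
```

Since \(U_t\) is unitary, the amplitudes remain normalized at every t, matching the probability constraint stated after Eq. (2).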

The density operator (or density matrix) is introduced in quantum mechanics to describe a system whose state is an ensemble of pure quantum states \(\left| \psi _i\right\rangle \), each with probability \(p_i\). The density operator of such a system is a positive unit-trace matrix defined as

$$\begin{aligned} \rho = \sum _i p_i \left| \psi _i\right\rangle \left\langle \psi _i\right| . \end{aligned}$$
(5)

The von Neumann entropy [10] \(H_N\) of a mixed state is defined in terms of the trace and logarithm of the density operator \(\rho \)

$$\begin{aligned} H_N = -{{\mathrm{Tr}}}(\rho \ln \rho )=-\sum _i \xi _i \ln \,\xi _i \end{aligned}$$
(6)

where \(\xi _1,\ldots ,\xi _n\) are the eigenvalues of \(\rho \). The von Neumann entropy is related to the distinguishability of the states, i.e., the amount of information that can be extracted from an observation on the mixed state.
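Computationally, Eq. (6) reduces to an eigenvalue computation. A minimal sketch (the function name and the numerical tolerance are our own choices, using the convention \(0 \ln 0 = 0\)):

```python
import numpy as np

def von_neumann_entropy(rho, tol=1e-12):
    """H_N = -sum_i xi_i ln xi_i over the eigenvalues of rho (Eq. 6).
    Eigenvalues below tol are treated as zero (0 ln 0 = 0)."""
    xi = np.linalg.eigvalsh(rho)
    xi = xi[xi > tol]
    return float(-np.sum(xi * np.log(xi)))
```

As a sanity check, the maximally mixed state on n levels has entropy \(\ln n\), while any pure state has entropy zero.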

The observation process for a quantum system is defined in terms of projections onto orthogonal subspaces associated with operators on the quantum state-space called observables. Let O be an observable of the system, with spectral decomposition

$$\begin{aligned} O=\sum _i a_i P_i \end{aligned}$$
(7)

where the \(a_i\) are the (distinct) eigenvalues of O and the \(P_i\) the orthogonal projectors onto the corresponding eigenspaces. The outcome of an observation, or projective measurement, of a quantum state \(\left| \psi \right\rangle \) is one of the eigenvalues \(a_i\) of O, with probability

$$\begin{aligned} P(a_i)= \left\langle \psi \right| P_i \left| \psi \right\rangle \end{aligned}$$
(8)

After the measurement, the state of the quantum system becomes

$$\begin{aligned} \left| \bar{\psi }\right\rangle = \frac{P_i \left| \psi \right\rangle }{||P_i \left| \psi \right\rangle ||}, \end{aligned}$$
(9)

where \(||P_i \left| \psi \right\rangle ||\) is the norm of the vector \(P_i \left| \psi \right\rangle \).

Density operators play an important role in the quantum observation process. The observation probability of \(a_i\) is \(P(a_i)= {{\mathrm{Tr}}}(\rho P_i)\), with the mixed state being projected by the observation process onto the state represented by the modified density matrix \(\rho ' = \sum _i P_i \rho P_i\). The expectation of the measurement is \(\langle O \rangle = {{\mathrm{Tr}}}(\rho O)\). The projective nature of quantum observation means that an observation actively modifies the system, both by altering its entropy and by forcing an energy exchange between the quantum system and the observer.
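The measurement rules above can be checked on a minimal two-level example (our own toy construction, not a system from the paper), where the observable has spectral decomposition \(O = (+1)P_{\uparrow} + (-1)P_{\downarrow}\):

```python
import numpy as np

# projectors onto the two eigenspaces of a two-level observable (Eq. 7)
P_up = np.array([[1, 0], [0, 0]], dtype=complex)
P_down = np.array([[0, 0], [0, 1]], dtype=complex)
O = P_up - P_down                                 # eigenvalues +1 and -1

rho = np.diag([0.75, 0.25]).astype(complex)       # a mixed state

p_up = np.trace(rho @ P_up).real                  # P(a_i) = Tr(rho P_i)
expectation = np.trace(rho @ O).real              # <O> = Tr(rho O)
rho_after = P_up @ rho @ P_up + P_down @ rho @ P_down  # rho' = sum_i P_i rho P_i
```

Here the expectation is \(0.75 \cdot (+1) + 0.25 \cdot (-1) = 0.5\), and the post-measurement ensemble \(\rho'\) still has unit trace.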

Thermodynamics describes the behavior of a composite system in terms of macroscopic variables such as energy, entropy and temperature. These are linked together by the thermodynamic identity

$$\begin{aligned} dU = TdS - PdV \end{aligned}$$
(10)

where U is the internal energy, S the entropy, V the volume, T the temperature, and P the pressure.

Following Passerini and Severini [11] in their use of the normalized Laplacian matrix as a density operator defining the current state of the network, we derive the network entropy in terms of the von Neumann entropy

$$\begin{aligned} S_{VN} = - \sum ^{|V|}_{i = 1}\frac{\tilde{\lambda }_i}{|V|}\ln \frac{\tilde{\lambda }_i}{|V|} \end{aligned}$$
(11)

With this we can measure dS, the change in entropy as the network evolves. Previous work used a similar entropic measure to define thermodynamic variables on networks, but linked energy to the number of edges in the graph [15] or derived it through the Boltzmann partition function of the network [14]. However, in these approaches the structure of the graph has the dual function of state (determining the density operator) and operator. Here we opt for a different approach that does away with this duality, assuming that the energy operator is unknown and must be estimated from the evolution. We assume that the dynamics of the network is governed by a free evolution following the Schrödinger equation under an unknown time-varying Hamiltonian \(\mathcal {H}_t\), together with an interaction with the outside world which acts as an observer. The free evolution does not change the thermodynamic variables, so the cause of the variation in entropy has to be sought in the interaction process, which also causes an energy exchange.

To measure the energy exchange we need to recover the potential term expressed by the unknown Hamiltonian. The Hamiltonian acts as an energy operator, resulting in the following expression for the change in energy between states \(\rho _t\) and \(\rho _{t+1}\)

$$\begin{aligned} dU = {{\mathrm{Tr}}}(\mathcal {H}_t \rho _{t+1}) - {{\mathrm{Tr}}}(\mathcal {H}_t \rho _t) \end{aligned}$$
(12)

We estimate the Hamiltonian \(\mathcal {H}_t\) as the one that minimizes the exchange of energy through the interaction with the environment. To this end we assume that the interaction intervenes at the end of the free evolution, where \(\rho _t\) is transformed by the Schrödinger equation into

$$\begin{aligned} \hat{\rho }_{t+1} = \exp (-i\mathcal {H}_t) \rho _t \exp (i\mathcal {H}_t) \end{aligned}$$
(13)

The exchange of energy in the interaction is then

$$\begin{aligned} \mathcal {H}_t= & {} \mathop {{{\mathrm{arg\,min}}}}\limits _{H} {{\mathrm{Tr}}}(H \rho _{t+1}) - {{\mathrm{Tr}}}(H \hat{\rho }_{t+1})\\= & {} \mathop {{{\mathrm{arg\,min}}}}\limits _{H} {{\mathrm{Tr}}}\left( H \big (\rho _{t+1} - \exp (-i H) \rho _t \exp (i H)\big ) \right) \nonumber \end{aligned}$$
(14)

Let \(\rho _t = \varPhi _t\varLambda _t\varPhi _t^T\) be the spectral decomposition of the state of the network at time t. Eq. (14) can be solved by noting that the minimum energy exchange occurs when the interaction changes the eigenvalues of the density matrices, and with them the entropy, but does not change the corresponding eigenspaces. In other words, the Hamiltonian is the cause of the eigenvector rotation and can be recovered from it:

$$\begin{aligned} \mathcal {H}_t \approx i\log (\varPhi _{t+1}\varPhi _t^T) \end{aligned}$$
(15)

It is worth noting that this is a lower bound on the Hamiltonian, since we cannot observe components on the null spaces of the density matrices. Furthermore, we have

$$\begin{aligned} \underbrace{\varPhi _{t+1}\varPhi _t^T}_{\mathcal {U}}\rho _t\underbrace{\varPhi _t\varPhi _{t+1}^T}_{\mathcal {U}^T} = \hat{\rho }_{t+1}, \end{aligned}$$
(16)

where \(\mathcal {U}=\varPhi _{t+1}\varPhi _t^T\) is the unitary evolution matrix. The final change in internal energy is then

$$\begin{aligned} dU = {{\mathrm{Tr}}}(\mathcal {H}_t\rho _{t+1})- {{\mathrm{Tr}}}(\mathcal {H}_t\rho _t) \end{aligned}$$
(17)
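Eqs. (15)-(17) can be sketched numerically as follows; the function name is ours, and the matrix logarithm is computed through an eigendecomposition of the (normal) evolution matrix rather than a library routine. Note that eigenvector sign and ordering ambiguities in the spectral decompositions mean this is an estimate, consistent with the lower-bound caveat above.

```python
import numpy as np

def energy_exchange(rho_t, rho_t1):
    """Estimate H_t ~ i log(Phi_{t+1} Phi_t^T) (Eq. 15) and the change in
    internal energy dU = Tr(H_t rho_{t+1}) - Tr(H_t rho_t) (Eq. 17)."""
    _, Phi_t = np.linalg.eigh(rho_t)
    _, Phi_t1 = np.linalg.eigh(rho_t1)
    U = Phi_t1 @ Phi_t.T                      # unitary evolution matrix (Eq. 16)
    # matrix logarithm of U via its eigendecomposition; cast to complex so
    # that eigenvalues on the negative real axis are handled
    w, V = np.linalg.eig(U)
    H = 1j * (V @ np.diag(np.log(w.astype(complex))) @ np.linalg.inv(V))
    dU = (np.trace(H @ rho_t1) - np.trace(H @ rho_t)).real
    return H, dU

# toy example: two nearby states with slightly rotated eigenvectors
theta = 0.1
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
rho_t = np.diag([0.7, 0.3])
rho_t1 = R @ np.diag([0.65, 0.35]) @ R.T
H, dU = energy_exchange(rho_t, rho_t1)
```

The recovered operator is Hermitian, as an energy operator must be, and dU is real.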

The thermodynamic temperature T can then be recovered through the fundamental thermodynamic relation \(dU = TdS - PdV\), where we assume that the volume is constant, i.e. \(dV=0\) (an isochoric process). As a result, the temperature T is the rate of change of internal energy with entropy

$$\begin{aligned} T=\frac{dU}{dS} \end{aligned}$$
(18)

This definition can be applied to evolving complex networks whose number of nodes does not change during the evolution.
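Given per-interval sequences of energy and entropy changes, Eq. (18) is a pointwise ratio. A small sketch (ours, with an explicit guard for the intervals where the entropy does not change, since T is undefined there):

```python
import numpy as np

def temperature(dU, dS, eps=1e-12):
    """T = dU/dS (Eq. 18) along sequences of per-interval changes.
    Intervals with |dS| < eps are returned as NaN (undefined T)."""
    dU = np.asarray(dU, dtype=float)
    dS = np.asarray(dS, dtype=float)
    T = np.full_like(dU, np.nan)
    mask = np.abs(dS) > eps
    T[mask] = dU[mask] / dS[mask]
    return T
```

This guard matters in practice: a plateau in the entropy time series would otherwise produce spurious infinite temperatures.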

3 Experimental Evaluation

In this section we evaluate the ability of the thermodynamic variables to describe the overall dynamics of a system and to characterize significant changes of the network's state. In particular, we investigate how the estimated energy exchange describes the temporal trend of the network and whether the approach is effective in detecting critical events of complex phenomena (e.g. financial crises or crashes). To this aim, we focus on two real-world time-evolving networks, representing the stock price correlations of the New York Stock Exchange (NYSE) and the gene expression of the Fruit Fly (Drosophila melanogaster).

3.1 Datasets

NYSE: The dataset is extracted from a database containing the daily prices of 3799 stocks traded on the New York Stock Exchange (NYSE). The dynamic network is built by selecting 347 stocks with historical data from May 1987 to February 2011 [12]. To obtain an evolving network, a time window of 28 days is moved along the time axis to obtain a sequence (from day 29 to day 6004), so that every temporal window covers a subsequence of the daily return stock values over a 28-day period. Then, to represent trades among the different stocks as a network, the cross-correlation coefficient between the time series of each pair of stocks is computed for each time window. Two stocks are connected if the absolute value of their correlation coefficient exceeds a threshold. In this way we construct a stock market network which changes over time, with a fixed number of 347 nodes and a varying edge structure for each trading day.
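The window-to-graph step can be sketched as follows; the function name and the threshold value are illustrative (the paper does not state the threshold used in [12]):

```python
import numpy as np

def correlation_network(window_returns, threshold=0.6):
    """Adjacency matrix for one time window: nodes are stocks, and an
    edge joins two stocks whose absolute cross-correlation over the
    window exceeds the threshold. `window_returns` has shape
    (days, n_stocks). The threshold here is an illustrative value."""
    C = np.corrcoef(window_returns.T)          # n_stocks x n_stocks
    A = (np.abs(C) > threshold).astype(float)
    np.fill_diagonal(A, 0.0)                   # no self-loops
    return A

# toy 28-day window: stocks 0 and 1 move together, stock 2 is noise
rng = np.random.default_rng(0)
x = rng.normal(size=28)
window = np.column_stack([x, x + 0.01 * rng.normal(size=28),
                          rng.normal(size=28)])
A = correlation_network(window, threshold=0.9)
```

Sliding this over the full price history yields the sequence of 347-node graphs whose normalized Laplacians feed the thermodynamic analysis.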

Drosophila: The dataset comes from biology and collects interactions among genes of the Fruit Fly - Drosophila melanogaster - during its life cycle. The fruit fly life cycle is divided into four stages; data is sampled at 66 sequential developmental time points. Early embryos are sampled hourly and adults are sampled at multiday intervals, according to the speed of the morphological changes. Each stage gathers a set of samples: the embryonic phase contains samples from time point 1 to time point 30, the larval stage has samples 31-40, the pupal stage samples 41-58, and the remaining samples concern adulthood. To represent the data as a time-evolving network, the following steps are followed [13]. At each developmental point, the 588 genes that are known to play an important role in the development of the Drosophila are selected. These genes are the nodes of the network, and edges are established based on the microarray gene expression measurements reported in [1]. To make the normalized Laplacian more tractable, any self-loop in the resulting undirected graph has been removed at each time point. This dataset yields a time-evolving network with a fixed number of 588 nodes, sampled at 66 developmental time points.

3.2 Experiments

To carry out our analysis, we first computed the normalized Laplacian of the network at each step (e.g. in the NYSE the time interval is a day), and then the thermodynamic variables, entropy and energy exchange (i.e. the change in internal energy), as shown in Eqs. (6) and (17), respectively. Then, by means of the entropy variation dS, we computed the temperature (Eq. (18)) and finally we derived the energy from the energy variation. Initial investigations were oriented towards a general analysis of the behaviour of the three main variables; afterwards we shifted the focus to the one with the best (qualitative) performance.

Fig. 1.

Up-Bottom: entropy variation, energy variation and temperature versus time (May 1987 - October 1996), for the dynamic stock correlation network. The vertical colored lines refer to the most important and devastating events for the trade market. Left-Right: Black Monday (19th October 1987), Friday the 13th Mini-Crash (13th October 1989), Persian Gulf War (2nd August 1990 - 17th January 1991). (Color figure online)

Fig. 2.

Up-Bottom: entropy variation, energy variation and temperature versus time (March 1997 - December 2001), for the NYSE dataset. The vertical colored lines signal important events. Left-Right: Asian Financial Crisis (July 1997 – October 1997), Russian Financial Crisis – Ruble devaluation (17th August 1998), Dot-com bubble - climax (10th March 2000), September 11 attacks. (Color figure online)

Fig. 3.

Scatter plot of Energy vs Entropy (New York Stock Exchange data). Each dot is a day and grey dots are the background. Dots of the same color belong to the same network phase. Horizontal lines represent cluster centroids for the energy dimension. (Color figure online)

We commenced by examining the energy variation dU, the entropy variation dS, and the temperature T as fluctuation indicators for the NYSE dataset (more suitable at this exploratory level, since it presents many phase oscillations to be detected). Figures 1 and 2 help compare the three quantities throughout two slices of the time series that include well-distinct events. Both signals tend to exhibit clear alterations in the proximity of major events, even if the entropy variation appears slightly noisier than the energy. For instance, in Fig. 2, the Asian financial crisis is well delimited by the energy variation, as is the Persian Gulf War in Fig. 1, while the entropy signal is slightly less precise, though still acceptable. Consequently the temperature, strongly affected by the entropy variation, sometimes oscillates even when no financial incident influences the system. An example of these unjustified swings is visible in Fig. 1, after January 1995.

We now turn our attention to the energy dimension, which the preliminary examinations showed to have the lowest volatility. In Fig. 3, we show the scatter plot of entropy versus energy for the NYSE dataset. This representation allows us to assess the effectiveness of the energy exchange in characterizing the network state. Indeed, an interesting feature of the network emerges from the chart: the market exhibits a clustering-like behavior when the system endures strong modifications. However, each cluster presents a wide entropy variation but a low energy variation. Thus, we conclude that the network's states are better identified by the energy, which effectively captures cluster compactness, than by the more dispersive entropy. Further evidence of this energetic characterization comes from Fig. 5, concerning the Drosophila melanogaster data. Here again the entropy-versus-energy plot was adopted; we can observe that the stages of the fruit fly life cycle, seen as phase transitions, are recognized by the energy exchange in a more succinct way than by the entropy dimension. Qualitative comparisons with other approaches adopting thermodynamic characterizations, such as [14], confirm that a clear distinction is not always straightforward, above all when the amount of data is scarce (e.g., few time epochs in the time series).

Finally, in Fig. 4, the temporal trend of the energy (recovered from the energy exchange) clearly shows how the estimation of the hidden time-varying Hamiltonian successfully extracts information from the data and how the energy can be considered a decisive state function.

Fig. 4.

Energy-exchange versus time of the NYSE network (May 1987 - December 2003). Left-Right: Black Monday (19th October 1987), Friday the 13th Mini-Crash (13th October 1989), Persian Gulf War (2nd August 1990 - 17th January 1991), Asian Financial Crisis (July 1997 – October 1997), Russian Financial Crisis – Ruble devaluation (17th August 1998), Dot-com bubble - climax (10th March 2000), September 11 attacks, Downturn of 2002–2003. (Color figure online)

Fig. 5.

Scatter plot of Energy vs Entropy of the Drosophila melanogaster network. Each dot represents a sample. Numbers and colors are used to identify samples and life cycle stages. Horizontal lines represent cluster centroids for the energy dimension. (Color figure online)

4 Discussion and Conclusion

In this paper, we adopt a thermodynamic characterization of temporal network structure in order to represent and understand the evolution of time-varying networks. We provide expressions for the thermodynamic variables, i.e. entropy, energy and temperature, and in addition we derive a measure of energy exchange. This analysis is based on quantum thermodynamics and connects to recent work on the von Neumann entropy of networks. The energy exchange is derived by estimating an unknown Hamiltonian operator governing the free evolution through the Schrödinger equation. We have evaluated the approach experimentally using real-world data, representing time-varying complex systems from the financial and biological fields. The experimental results show that the energy exchange is a convenient and efficient measure for analyzing the evolutionary properties of dynamic networks, able to detect phase transitions and abrupt changes occurring in complex phenomena.