Abstract
What do we mean when we say that a given system shows “complex behavior”, and can we provide precise measures for its degree of complexity? This chapter offers an account of several common measures of complexity and of the relation of complexity to predictability and emergence.
Notes
- 1.
In some areas, such as the neurosciences or artificial intelligence, the term “Bayesian” is used for approaches employing statistical methods, in particular in the context of hypothesis building, when estimates of probability distribution functions are derived from observations.
- 2.
The expression \(p(x_i)\) is therefore context specific and can denote both a properly normalized discrete distribution function and the value of a continuous probability distribution function.
- 3.
In formal texts on statistics and information theory the notation \(\mu = E(X)\) is often used for the mean \(\mu\), the expectation value \(E(X)\) and a random variable \(X\), where \(X\) represents the abstract random variable, whereas \(x\) denotes its particular value and \(p_X(x)\) the probability distribution.
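As a small illustration of this notation, the expectation value of a discrete random variable can be computed directly from its distribution; the fair die below is an arbitrary example, not one taken from the text:

```python
# Expectation value mu = E(X) = sum_x x * p_X(x) for a discrete random
# variable X. The distribution used here (a fair die) is illustrative only.

def expectation(values, probabilities):
    """Return the mean mu = E(X) of a discrete distribution p_X."""
    assert abs(sum(probabilities) - 1.0) < 1e-9, "p_X must be normalized"
    return sum(x * p for x, p in zip(values, probabilities))

values = [1, 2, 3, 4, 5, 6]       # possible values x of X
probs = [1 / 6] * 6               # p_X(x) = 1/6 for a fair die

mu = expectation(values, probs)
print(mu)  # close to 3.5
```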
- 4.
Please take note of the difference between a cumulative stochastic process, obtained by adding the results of individual trials, and the “cumulative PDF” \(F(x)\) defined by \(F(x) = \int_{-\infty}^{x} p(x')\,dx'\).
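The cumulative PDF can be made concrete numerically; the following sketch approximates \(F(x)\) by a midpoint Riemann sum, with the standard normal density chosen arbitrarily as \(p(x)\):

```python
import math

def gaussian_pdf(x):
    """Standard normal probability density p(x); an arbitrary example."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def cumulative_pdf(x, lower=-8.0, n=100_000):
    """Approximate F(x) = int_{-inf}^{x} p(x') dx' by a midpoint sum,
    truncating the lower integration limit where p is negligible."""
    dx = (x - lower) / n
    return sum(gaussian_pdf(lower + (k + 0.5) * dx) for k in range(n)) * dx

print(cumulative_pdf(0.0))  # close to 0.5, by symmetry of the Gaussian
print(cumulative_pdf(8.0))  # close to 1.0, since F is normalized
```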
- 5.
For continuous-time data, such as an electrocardiogram, an additional symbolization step is necessary: the discretization of time. Here, however, we consider only discrete-time series.
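A minimal symbolization sketch, in which the signal (a sine) and the threshold are arbitrary illustrative choices: the continuous signal is sampled at discrete times and each sample is mapped to a binary symbol.

```python
import math

def symbolize(signal, times, threshold=0.0):
    """Sample a continuous signal at the given discrete times and map
    each sample to the binary alphabet {0, 1} via a threshold."""
    return [1 if signal(t) > threshold else 0 for t in times]

times = [k * 0.1 for k in range(20)]   # discretized time axis
symbols = symbolize(math.sin, times)   # symbol 1 wherever sin(t) > 0
print(symbols)
```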
- 6.
Remember that \(\mathrm{XOR}(0,0) = 0 = \mathrm{XOR}(1,1)\) and \(\mathrm{XOR}(0,1) = 1 = \mathrm{XOR}(1,0)\).
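These identities can be checked directly; `^` is Python's built-in bitwise XOR operator:

```python
# Verify the XOR truth table; ^ is Python's bitwise XOR.
for a in (0, 1):
    for b in (0, 1):
        print(f"XOR({a}, {b}) = {a ^ b}")

assert (0 ^ 0) == 0 == (1 ^ 1)
assert (0 ^ 1) == 1 == (1 ^ 0)
```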
- 7.
A function f(x) is a function of a variable x; a functional F[f], on the other hand, depends functionally on a function f(x). In formal texts on information theory the notation H(X) is often used for the Shannon entropy of a random variable X with probability distribution \(p_X(x)\).
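The functional character of the Shannon entropy is visible in code: H[p] takes a whole distribution as its argument, not a single value. A minimal sketch, assuming the base-2 logarithm (entropy in bits) and two arbitrary example distributions:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H[p] = -sum_x p(x) log2 p(x), in bits.
    A functional: it depends on the whole distribution p, not on
    any single value x. Terms with p(x) = 0 contribute nothing."""
    return -sum(px * math.log2(px) for px in p if px > 0)

uniform = [0.25] * 4            # maximal entropy for 4 outcomes
peaked = [1.0, 0.0, 0.0, 0.0]   # deterministic outcome: zero entropy

print(shannon_entropy(uniform))  # 2.0 bits
print(shannon_entropy(peaked))   # zero
```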
- 8.
For a proof consider the generic substitution \(x \to q(x)\) and a transformation of variables via \(dx = dq/q'\), with \(q' = dq(x)/dx\), for the integration in Eq. (3.48).
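Eq. (3.48) is not reproduced in these notes; assuming it defines the differential entropy \(H[p] = -\int p(x)\ln p(x)\,dx\), the substitution sketched above proceeds as follows (for monotonic \(q(x)\)):

```latex
% Conservation of probability under x -> q(x) gives
% p_q(q)\,dq = p(x)\,dx, i.e. p_q(q) = p(x)/q'(x).
H[p_q] = -\int p_q(q)\,\ln p_q(q)\,dq
       = -\int p(x)\,\ln\frac{p(x)}{q'(x)}\,dx
       = H[p] + \int p(x)\,\ln q'(x)\,dx ,
% so the differential entropy is not invariant under a change of variables.
```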
© 2013 Springer-Verlag Berlin Heidelberg
Cite this chapter
Gros, C. (2013). Complexity and Information Theory. In: Complex and Adaptive Dynamical Systems. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-36586-7_3
Print ISBN: 978-3-642-36585-0
Online ISBN: 978-3-642-36586-7