Abstract
How can the information that a set \(\{X_{1},\dots ,X_{n}\}\) of random variables contains about another random variable S be decomposed? To what extent do different subgroups provide the same, i.e. shared or redundant, information, carry unique information, or interact to produce synergistic information?
Recently, Williams and Beer proposed such a decomposition based on natural properties of shared information. While these properties fix the structure of the decomposition, they do not uniquely determine the values of its terms. We therefore investigate additional properties, such as strong symmetry and left monotonicity. We find that strong symmetry is incompatible with the properties proposed by Williams and Beer. Although left monotonicity is a very natural property for an information measure, it is not satisfied by any of the proposed measures.
We also study a geometric framework for information decompositions and ask whether it is possible to represent shared information by a family of posterior distributions.
Finally, we draw connections to the notions of shared knowledge and common knowledge in game theory. While many people believe that independent variables cannot share information, we show that, in game theory, independent agents can have shared knowledge but not common knowledge. We conclude that intuition and heuristic arguments do not suffice when arguing about information.
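As a concrete illustration of why such a decomposition is needed (this example is not taken from the abstract, but the XOR construction is the standard textbook case of purely synergistic information): if S is the XOR of two independent fair bits, each input alone carries zero mutual information about S, yet the pair determines S completely. A minimal sketch computing these mutual informations from joint distributions:

```python
import math
from itertools import product

def mutual_information(joint):
    """Mutual information I(A:B) in bits, from a joint distribution
    given as a dict {(a, b): probability}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Two independent fair bits X1, X2 and S = X1 XOR X2;
# each of the four (x1, x2) combinations has probability 1/4.
joint_s_x1, joint_s_x2, joint_s_x12 = {}, {}, {}
for x1, x2 in product((0, 1), repeat=2):
    s = x1 ^ x2
    joint_s_x1[(s, x1)] = joint_s_x1.get((s, x1), 0.0) + 0.25
    joint_s_x2[(s, x2)] = joint_s_x2.get((s, x2), 0.0) + 0.25
    joint_s_x12[(s, (x1, x2))] = 0.25

print(mutual_information(joint_s_x1))   # 0.0: X1 alone tells nothing about S
print(mutual_information(joint_s_x2))   # 0.0: X2 alone tells nothing about S
print(mutual_information(joint_s_x12))  # 1.0: together they determine S
```

All one bit of \(I(S:(X_1,X_2))\) here is synergistic: no measure of shared or unique information consistent with the Williams–Beer axioms can assign it to either input alone.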
Notes
- 1.
Here, p(X) denotes the probability distribution of the random variable X. When referring to the probability of a particular outcome \(x \in \mathcal {X}\) of this random variable, we write p(x).
- 2.
A related notion has been developed in the context of cryptography to quantify secret information. Although the secret information has a clear operational interpretation, it cannot be computed directly, but it is upper bounded by the intrinsic mutual information \(I(S:X_{1}\downarrow X_{2})\) [9, 10]. Unfortunately, the intrinsic mutual information does not obey the consistency condition (35.5), and hence it cannot be interpreted as unique information in our sense.
- 3.
Note the similarity to the definition of a random variable as a measurable map from a probability space to outcomes. In fact, if we choose an arbitrary probability distribution on \(\mathcal {S}\), then the partition \(X_{i}\), considered as a function \(\mathcal {S}\to \mathcal {X}_{i}\), becomes a random variable.
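The point of this note can be made concrete with a toy example (the state space, partition, and probabilities below are hypothetical, chosen only for illustration): a partition is just a labeling function on outcomes, and fixing a distribution on the outcomes induces a distribution on the labels, i.e. a random variable.

```python
from collections import Counter

# A toy state space S and a partition X of it into two cells,
# written as a function from outcomes to cell labels.
states = ["a", "b", "c", "d"]

def X(s):
    # The partition: cell 0 = {a, b}, cell 1 = {c, d}.
    return 0 if s in ("a", "b") else 1

# Choosing a probability distribution p on S turns the partition
# into a random variable with an induced distribution p_X.
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

p_X = Counter()
for s, prob in p.items():
    p_X[X(s)] += prob

print(dict(p_X))  # {0: 0.75, 1: 0.25}
```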
References
Schneidman E, Bialek W, Berry MJ (2003) Synergy, redundancy, and independence in population codes. J Neurosci 23(37):11539–11553
Latham PE, Nirenberg S (2005) Synergy, redundancy, and independence in population codes, revisited. J Neurosci 25(21):5195–5206
Pessoa L, Ungerleider LG (2008) What and where pathways. Scholarpedia 3(10):5342
Williams P, Beer R (2010) Nonnegative decomposition of multivariate information. arXiv:1004.2515v1
Ay N, Olbrich E, Bertschinger N, Jost J (2011) A geometric approach to complexity. Chaos 21(3):037103
Harder M, Salge C, Polani D (2012) A bivariate measure of redundant information. CoRR. arXiv:1207.2080 [cs.IT]
Cover T, Thomas J (1991) Elements of information theory, 1st edn. Wiley, New York
Bell AJ (2003) The co-information lattice. In: Proc fourth int symp independent component analysis and blind signal separation (ICA 03)
Maurer U, Wolf S (1997) The intrinsic conditional mutual information and perfect secrecy. In: IEEE international symposium on information theory
Christandl M, Renner R, Wolf S (2003) A property of the intrinsic mutual information. In: IEEE international symposium on information theory
Aumann RJ (1976) Agreeing to disagree. Ann Stat 4(6):1236–1239
Halpern JY (1995) Reasoning about knowledge: a survey. In: Gabbay D, Hogger CJ, Robinson JA (eds) Handbook of logic in artificial intelligence and logic programming, vol 4. Oxford University Press, London, pp 1–34
Acknowledgements
This work was supported by the VW Foundation (J. Rauh) and has received funding from the European Community’s Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 258749 (to E. Olbrich).
Copyright information
© 2013 Springer International Publishing Switzerland
Cite this paper
Bertschinger, N., Rauh, J., Olbrich, E., Jost, J. (2013). Shared Information—New Insights and Problems in Decomposing Information in Complex Systems. In: Gilbert, T., Kirkilionis, M., Nicolis, G. (eds) Proceedings of the European Conference on Complex Systems 2012. Springer Proceedings in Complexity. Springer, Cham. https://doi.org/10.1007/978-3-319-00395-5_35
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-00394-8
Online ISBN: 978-3-319-00395-5
eBook Packages: Physics and Astronomy (R0)