Shared Information—New Insights and Problems in Decomposing Information in Complex Systems

  • Conference paper

Part of the book series: Springer Proceedings in Complexity (SPCOM)

Abstract

How can the information that a set \(\{X_1, \dots, X_n\}\) of random variables contains about another random variable S be decomposed? To what extent do different subgroups provide the same, i.e. shared or redundant, information, carry unique information, or interact to produce synergistic information?
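
For two variables (n = 2), such a decomposition has the following structure, writing SI for shared, UI for unique, and CI for complementary (synergistic) information; these symbols are used here only for illustration:

\[
\begin{aligned}
I(S : X_1, X_2) &= SI(S : X_1; X_2) + UI(S : X_1 \setminus X_2) + UI(S : X_2 \setminus X_1) + CI(S : X_1; X_2),\\
I(S : X_1) &= SI(S : X_1; X_2) + UI(S : X_1 \setminus X_2),\\
I(S : X_2) &= SI(S : X_1; X_2) + UI(S : X_2 \setminus X_1).
\end{aligned}
\]

The last two identities require the shared and unique terms to be consistent with the ordinary mutual informations of the individual variables.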

Recently, Williams and Beer proposed such a decomposition based on natural properties for shared information. While these properties fix the structure of the decomposition, they do not uniquely specify the values of the different terms. Therefore, we investigate additional properties, such as strong symmetry and left monotonicity. We find that strong symmetry is incompatible with the properties proposed by Williams and Beer. Although left monotonicity is a very natural property for an information measure, it is not fulfilled by any of the proposed measures.
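
As a concrete point of reference, the redundancy measure I_min proposed by Williams and Beer can be computed directly from the joint distribution. The following Python sketch handles the bivariate case; the function name i_min and the dictionary encoding of the joint distribution are illustrative choices, not taken from the paper.

from collections import defaultdict
from math import log2

def i_min(joint):
    # joint: dict mapping (s, x1, x2) -> probability
    p_s = defaultdict(float)                          # p(s)
    p_x = [defaultdict(float), defaultdict(float)]    # p(x1), p(x2)
    p_sx = [defaultdict(float), defaultdict(float)]   # p(s, x1), p(s, x2)
    for (s, x1, x2), p in joint.items():
        p_s[s] += p
        for i, x in enumerate((x1, x2)):
            p_x[i][x] += p
            p_sx[i][(s, x)] += p

    def specific_info(s, i):
        # Williams-Beer specific information:
        # I(S = s; X_i) = sum_x p(x | s) * log2( p(s | x) / p(s) )
        total = 0.0
        for (s2, x), psx in p_sx[i].items():
            if s2 != s or psx == 0.0:
                continue
            total += (psx / p_s[s]) * log2((psx / p_x[i][x]) / p_s[s])
        return total

    # I_min(S; X1, X2) = sum_s p(s) * min_i I(S = s; X_i)
    return sum(p * min(specific_info(s, 0), specific_info(s, 1))
               for s, p in p_s.items() if p > 0.0)

# XOR example: S = X1 xor X2 with independent, uniform binary inputs.
xor = {(x1 ^ x2, x1, x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}
print(i_min(xor))  # 0.0: neither input on its own is informative about S

For the XOR example I_min is zero, so the full bit of mutual information I(S : X_1, X_2) is attributed to synergy.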

We also study a geometric framework for information decompositions and ask whether it is possible to represent shared information by a family of posterior distributions.

Finally, we draw connections to the notions of shared knowledge and common knowledge in game theory. While many people believe that independent variables cannot share information, we show that in game theory independent agents can have shared knowledge, but not common knowledge. We conclude that intuition and heuristic arguments do not suffice when arguing about information.

Notes

  1.

    Here, p(X) denotes the probability distribution of the random variable X. When referring to the probability of a particular outcome \(x \in \mathcal{X}\) of this random variable, we write p(x).

  2.

    A related notion has been developed in the context of cryptography to quantify the secret information. Although the secret information has a clear operational interpretation, it cannot be computed directly, but it is upper bounded by the intrinsic mutual information \(I(S : X_1 \downarrow X_2)\) [9, 10] (recalled after these notes). Unfortunately, the intrinsic mutual information does not obey the consistency condition (35.5), and hence it cannot be interpreted as unique information in our sense.

  3.

    Note the similarity to the definition of a random variable as a measurable map from a probability space to outcomes. In fact, if we choose an arbitrary probability distribution on \(\mathcal{S}\), then the partition \(X_i\), considered as a function \(\mathcal{S} \to \mathcal{X}_i\), becomes a random variable.
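
For reference, the intrinsic mutual information mentioned in note 2 is obtained by minimizing the conditional mutual information over all channels applied to \(X_2\) (stated here up to notation as in [9, 10]):

\[
I(S : X_1 \downarrow X_2) \;=\; \min_{p(\bar{x}_2 \mid x_2)} I(S : X_1 \mid \bar{X}_2),
\]

where \(\bar{X}_2\) denotes the output of an arbitrary channel \(p(\bar{x}_2 \mid x_2)\) applied to \(X_2\).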

References

  1. Schneidman E, Bialek W, Berry MJ (2003) Synergy, redundancy, and independence in population codes. J Neurosci 23(37):11539–11553

  2. Latham PE, Nirenberg S (2005) Synergy, redundancy, and independence in population codes, revisited. J Neurosci 25(21):5195–5206

  3. Pessoa LG, Ungerleider L (2008) What and where pathways. Scholarpedia 3(10):5342

  4. Williams P, Beer R (2010) Nonnegative decomposition of multivariate information. arXiv:1004.2515v1

  5. Ay N, Olbrich E, Bertschinger N, Jost J (2011) A geometric approach to complexity. Chaos 21(3):037103

  6. Harder M, Salge C, Polani D (2012) A bivariate measure of redundant information. CoRR. arXiv:1207.2080 [cs.IT]

  7. Cover T, Thomas J (1991) Elements of information theory, 1st edn. Wiley, New York

  8. Bell AJ (2003) The co-information lattice. In: Proc fourth int symp independent component analysis and blind signal separation (ICA 03)

  9. Maurer U, Wolf S (1997) The intrinsic conditional mutual information and perfect secrecy. In: IEEE international symposium on information theory

  10. Christandl M, Renner R, Wolf S (2003) A property of the intrinsic mutual information. In: IEEE international symposium on information theory

  11. Aumann RJ (1976) Agreeing to disagree. Ann Stat 4(6):1236–1239

  12. Halpern JY (1995) Reasoning about knowledge: a survey. In: Gabbay D, Hogger CJ, Robinson JA (eds) Handbook of logic in artificial intelligence and logic programming, vol 4. Oxford University Press, London, pp 1–34

Acknowledgements

This work was supported by the VW Foundation (J. Rauh) and has received funding from the European Community’s Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 258749 (to E. Olbrich).

Author information

Corresponding author

Correspondence to Nils Bertschinger.

Copyright information

© 2013 Springer International Publishing Switzerland

About this paper

Cite this paper

Bertschinger, N., Rauh, J., Olbrich, E., Jost, J. (2013). Shared Information—New Insights and Problems in Decomposing Information in Complex Systems. In: Gilbert, T., Kirkilionis, M., Nicolis, G. (eds) Proceedings of the European Conference on Complex Systems 2012. Springer Proceedings in Complexity. Springer, Cham. https://doi.org/10.1007/978-3-319-00395-5_35
