Shared Information—New Insights and Problems in Decomposing Information in Complex Systems

  • Nils Bertschinger
  • Johannes Rauh
  • Eckehard Olbrich
  • Jürgen Jost
Part of the Springer Proceedings in Complexity book series (SPCOM)

Abstract

How can the information that a set {X_1, …, X_n} of random variables contains about another random variable S be decomposed? To what extent do different subgroups provide the same, i.e. shared or redundant, information, carry unique information, or interact to produce synergistic information?

Recently, Williams and Beer proposed such a decomposition based on natural properties for shared information. While these properties fix the structure of the decomposition, they do not uniquely specify the values of the individual terms. We therefore investigate additional properties, such as strong symmetry and left monotonicity. We find that strong symmetry is incompatible with the properties proposed by Williams and Beer. Although left monotonicity is a very natural property for an information measure, it is not fulfilled by any of the proposed measures.
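To make the object under discussion concrete, the following is a minimal sketch of the redundancy measure I_min from Williams and Beer's decomposition, evaluated for two sources on a toy distribution. The choice of the AND gate as example, and all function names, are illustrative assumptions, not part of the paper:

```python
from itertools import product
from math import log2

# Illustrative joint distribution p(s, x1, x2): the AND gate,
# S = X1 AND X2 with X1, X2 independent and uniform on {0, 1}.
p = {}
for x1, x2 in product([0, 1], repeat=2):
    p[(x1 & x2, x1, x2)] = 0.25

def marginal(p, idx):
    """Marginalize the joint dict onto the coordinates in idx."""
    m = {}
    for k, v in p.items():
        key = tuple(k[i] for i in idx)
        m[key] = m.get(key, 0.0) + v
    return m

def specific_info(p, s, source_idx):
    """Specific information I(S = s; X_i): the average reduction in
    surprise about the particular outcome s from observing X_i."""
    ps = marginal(p, (0,))
    px = marginal(p, (source_idx,))
    psx = marginal(p, (0, source_idx))
    total = 0.0
    for (x,), pxv in px.items():
        joint = psx.get((s, x), 0.0)
        if joint > 0:
            p_x_given_s = joint / ps[(s,)]
            p_s_given_x = joint / pxv
            total += p_x_given_s * log2(p_s_given_x / ps[(s,)])
    return total

def i_min(p, sources=(1, 2)):
    """Williams-Beer redundancy I_min(S; X1, X2): for each outcome s,
    take the least informative source, then average over p(s)."""
    ps = marginal(p, (0,))
    return sum(psv * min(specific_info(p, s, i) for i in sources)
               for (s,), psv in ps.items())

print(round(i_min(p), 4))  # ≈ 0.3113 bits for the AND gate
```

Note that I_min assigns positive redundancy here even though X1 and X2 are independent — the behaviour the abstract's final paragraph addresses.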

We also study a geometric framework for information decompositions and ask whether it is possible to represent shared information by a family of posterior distributions.

Finally, we draw connections to the notions of shared knowledge and common knowledge in game theory. While many people believe that independent variables cannot share information, we show that in game theory independent agents can have shared knowledge, but not common knowledge. We conclude that intuition and heuristic arguments do not suffice when arguing about information.
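The distinction between shared and common knowledge drawn above can be illustrated in Aumann's partition model. The state space, partitions, and function names below are my own illustrative construction: two agents each observe one of two independent coins, so their information partitions are independent, yet there is an event both know that is not common knowledge (the meet of their partitions is trivial):

```python
# Four equally likely states, coded by two independent binary signals.
OMEGA = {1, 2, 3, 4}
P1 = [{1, 2}, {3, 4}]   # agent 1 observes only the first signal
P2 = [{1, 3}, {2, 4}]   # agent 2 observes only the second signal

def cell(partition, w):
    """The partition cell containing state w."""
    return next(c for c in partition if w in c)

def knows(partition, event, w):
    """An agent knows event E at state w iff her cell at w lies inside E."""
    return cell(partition, w) <= event

def meet(p1, p2, omega):
    """Finest common coarsening of two partitions: connected components
    of the 'cells overlap' relation.  Common knowledge of E at w requires
    the meet cell at w to be contained in E (Aumann 1976)."""
    cells = [set(c) for c in p1 + p2]
    components = []
    for w in omega:
        comp = {w}
        changed = True
        while changed:
            changed = False
            for c in cells:
                if c & comp and not c <= comp:
                    comp |= c
                    changed = True
        components.append(frozenset(comp))
    return set(components)

E = {1, 2, 3}
w = 1
mutual = knows(P1, E, w) and knows(P2, E, w)   # both agents know E at w
common = cell(meet(P1, P2, OMEGA), w) <= E      # but E is not common knowledge
print(mutual, common)                            # prints: True False
```

With independent partitions the meet collapses to the whole state space, so only trivial events can be common knowledge — while nontrivial events can still be mutually known, mirroring the claim that independent agents can have shared but not common knowledge.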

Acknowledgements

This work was supported by the VW Foundation (J. Rauh) and has received funding from the European Community’s Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 258749 (to E. Olbrich).

References

  1. Schneidman E, Bialek W, Berry MJ (2003) Synergy, redundancy, and independence in population codes. J Neurosci 23(37):11539–11553
  2. Latham PE, Nirenberg S (2005) Synergy, redundancy, and independence in population codes, revisited. J Neurosci 25(21):5195–5206
  3. Pessoa L, Ungerleider LG (2008) What and where pathways. Scholarpedia 3(10):5342
  4. Williams P, Beer R (2010) Nonnegative decomposition of multivariate information. arXiv:1004.2515v1
  5. Ay N, Olbrich E, Bertschinger N, Jost J (2011) A geometric approach to complexity. Chaos 21(3):037103
  6. Harder M, Salge C, Polani D (2012) A bivariate measure of redundant information. CoRR. arXiv:1207.2080 [cs.IT]
  7. Cover T, Thomas J (1991) Elements of information theory, 1st edn. Wiley, New York
  8. Bell AJ (2003) The co-information lattice. In: Proc fourth int symp independent component analysis and blind signal separation (ICA 03)
  9. Maurer U, Wolf S (1997) The intrinsic conditional mutual information and perfect secrecy. In: IEEE international symposium on information theory
  10. Christandl M, Renner R, Wolf S (2003) A property of the intrinsic mutual information. In: IEEE international symposium on information theory
  11. Aumann RJ (1976) Agreeing to disagree. Ann Stat 4(6):1236–1239
  12. Halpern JY (1995) Reasoning about knowledge: a survey. In: Gabbay D, Hogger CJ, Robinson JA (eds) Handbook of logic in artificial intelligence and logic programming, vol 4. Oxford University Press, London, pp 1–34

Copyright information

© Springer International Publishing Switzerland 2013

Authors and Affiliations

  • Nils Bertschinger (1)
  • Johannes Rauh (1)
  • Eckehard Olbrich (1)
  • Jürgen Jost (1)

  1. MPI für Mathematik in den Naturwissenschaften, Leipzig, Germany