
Quantifying Synergistic Mutual Information

Chapter in Guided Self-Organization: Inception

Part of the book series: Emergence, Complexity and Computation (ECC, volume 9)

Abstract

Synergy is a fundamental concept in complex systems that has received much attention in computational biology (Narayanan et al. 2005; Balduzzi and Tononi 2008). Several papers (Schneidman et al. 2003a; Bell 2003; Nirenberg et al. 2001; Williams and Beer 2010) have proposed measures for quantifying synergy, but there is still no consensus on which measure is most valid.
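To make "quantifying synergy" concrete, the following is a minimal sketch of one baseline measure from the cited literature (the whole-minus-sum synergy used, e.g., by Schneidman et al. 2003a), not the measure developed in this chapter: Syn(X1, X2 : Y) = I(X1,X2;Y) − I(X1;Y) − I(X2;Y), evaluated on the canonical XOR gate, where all of the output information is synergistic.

```python
# Minimal sketch (assumed baseline, not this chapter's proposal): the
# whole-minus-sum synergy Syn(X1,X2 : Y) = I(X1,X2;Y) - I(X1;Y) - I(X2;Y),
# evaluated on Y = X1 XOR X2 with independent uniform binary inputs.
from collections import defaultdict
from math import log2

# Joint distribution p(x1, x2, y) for the XOR gate; each input pair has mass 1/4.
p = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

def mutual_information(joint, x_idx, y_idx):
    """I(X;Y) in bits, with X and Y given as index tuples into the joint outcomes."""
    p_xy, p_x, p_y = defaultdict(float), defaultdict(float), defaultdict(float)
    for outcome, prob in joint.items():
        x = tuple(outcome[i] for i in x_idx)
        y = tuple(outcome[i] for i in y_idx)
        p_xy[(x, y)] += prob
        p_x[x] += prob
        p_y[y] += prob
    return sum(prob * log2(prob / (p_x[x] * p_y[y]))
               for (x, y), prob in p_xy.items() if prob > 0)

whole = mutual_information(p, (0, 1), (2,))                                     # I(X1,X2;Y) = 1 bit
parts = mutual_information(p, (0,), (2,)) + mutual_information(p, (1,), (2,))   # I(X1;Y) + I(X2;Y) = 0 bits
print(f"whole-minus-sum synergy for XOR: {whole - parts:.3f} bits")             # prints 1.000
```

The script prints 1.000 bits: neither input alone carries any information about Y, yet the pair determines it completely, which is the intuition the proposed synergy measures aim to capture.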


References

  • Amari, S.: Information geometry on hierarchical decomposition of stochastic interactions. IEEE Transactions on Information Theory 47, 1701–1711 (1999)

  • Anastassiou, D.: Computational analysis of the synergy among multiple interacting genes. Molecular Systems Biology 3, 83 (2007)

  • Balduzzi, D., Tononi, G.: Integrated information in discrete dynamical systems: motivation and theoretical framework. PLoS Computational Biology 4(6), e1000091 (2008)

  • Bell, A.J.: The co-information lattice. In: Amari, S., Cichocki, A., Makino, S., Murata, N. (eds.) Fifth International Workshop on Independent Component Analysis and Blind Signal Separation. Springer (2003)

  • Bertschinger, N., Rauh, J., Olbrich, E., Jost, J.: Shared information – new insights and problems in decomposing information in complex systems. CoRR, abs/1210.5902 (2012)

  • Chechik, G., Globerson, A., Anderson, M.J., Young, E.D., Nelken, I., Tishby, N.: Group redundancy measures reveal redundancy reduction in the auditory pathway. In: Dietterich, T.G., Becker, S., Ghahramani, Z. (eds.) NIPS 2002, pp. 173–180. MIT Press, Cambridge (2002)

  • Comtet, L.: Advanced Combinatorics: The Art of Finite and Infinite Expansions. Reidel, Dordrecht (1998)

  • Cover, T.M., Thomas, J.A.: Elements of Information Theory. John Wiley, New York (1991)

  • DeWeese, M.R., Meister, M.: How to measure the information gained from one symbol. Network 10, 325–340 (1999)

  • Gat, I., Tishby, N.: Synergy and redundancy among brain cells of behaving monkeys. In: Advances in Neural Information Processing Systems, pp. 465–471. MIT Press (1999)

  • Gawne, T.J., Richmond, B.J.: How independent are the messages carried by adjacent inferior temporal cortical neurons? Journal of Neuroscience 13, 2758–2771 (1993)

  • Han, T.S.: Nonnegative entropy measures of multivariate symmetric correlations. Information and Control 36(2), 133–156 (1978)

  • Harder, M., Salge, C., Polani, D.: A bivariate measure of redundant information. Physical Review E 87(1), 012130 (2013)

  • Latham, P.E., Nirenberg, S.: Synergy, redundancy, and independence in population codes, revisited. Journal of Neuroscience 25(21), 5195–5206 (2005)

  • Lei, W., Xu, G., Chen, B.: The common information of n dependent random variables. In: Forty-Eighth Annual Allerton Conference on Communication, Control, and Computing, pp. 836–843 (2010), arXiv:1010.3613

  • Lizier, J.T., Flecker, B., Williams, P.L.: Towards a synergy-based approach to measuring information modification. In: IEEE Symposium Series on Computational Intelligence (SSCI 2013) – IEEE Symposium on Artificial Life, Singapore. IEEE Press (April 2013)

  • Maurer, U.M., Wolf, S.: Unconditionally secure key agreement and the intrinsic conditional information. IEEE Transactions on Information Theory 45(2), 499–514 (1999)

  • Narayanan, N.S., Kimchi, E.Y., Laubach, M.: Redundancy and synergy of neuronal ensembles in motor cortex. The Journal of Neuroscience 25(17), 4207–4216 (2005)

  • Nirenberg, S., Carcieri, S.M., Jacobs, A.L., Latham, P.E.: Retinal ganglion cells act largely as independent encoders. Nature 411(6838), 698–701 (2001)

  • Nirenberg, S., Latham, P.E.: Decoding neuronal spike trains: How important are correlations? Proceedings of the National Academy of Sciences 100(12), 7348–7353 (2003)

  • Panzeri, S., Treves, A., Schultz, S., Rolls, E.T.: On decoding the responses of a population of neurons from short time windows. Neural Computation 11(7), 1553–1577 (1999)

  • Pola, G., Thiele, A., Hoffmann, K.P., Panzeri, S.: An exact method to quantify the information transmitted by different mechanisms of correlational coding. Network 14(1), 35–60 (2003)

  • Schneidman, E., Bialek, W., Berry II, M.: Synergy, redundancy, and independence in population codes. Journal of Neuroscience 23(37), 11539–11553 (2003a)

  • Schneidman, E., Still, S., Berry, M.J., Bialek, W.: Network information and connected correlations. Physical Review Letters 91(23), 238701–238705 (2003b)

  • Weisstein, E.W.: Antichain (2011), http://mathworld.wolfram.com/Antichain.html

  • White, D., Rabago-Smith, M.: Genotype-phenotype associations and human eye color. Journal of Human Genetics 56(1), 5–7 (2011)

  • Williams, P.L., Beer, R.D.: Nonnegative decomposition of multivariate information. CoRR, abs/1004.2515 (2010)


Author information


Corresponding author

Correspondence to Virgil Griffith.


Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Griffith, V., Koch, C. (2014). Quantifying Synergistic Mutual Information. In: Prokopenko, M. (ed.) Guided Self-Organization: Inception. Emergence, Complexity and Computation, vol 9. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-53734-9_6


  • DOI: https://doi.org/10.1007/978-3-642-53734-9_6

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-53733-2

  • Online ISBN: 978-3-642-53734-9

  • eBook Packages: Engineering, Engineering (R0)
