Information Theory and Complexity in Probability and Statistics

  • Conference paper
Soft Methodology and Random Information Systems

Part of the book series: Advances in Soft Computing ((AINSC,volume 26))

Abstract

Information theory is becoming more and more important in many fields. This is true for engineering- and technology-based areas, but also for more theoretically oriented sciences such as probability and statistics. Aspects of this development are discussed at a non-technical level, with emphasis on the role of information-theoretic games.
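The "information-theoretic games" mentioned above typically pit a statistician against Nature: Nature picks a distribution satisfying known constraints, the statistician picks a coding strategy, and at equilibrium the maximum entropy distribution emerges. As a toy illustration (not taken from the paper), the sketch below computes the maximum entropy distribution on a die with a constrained mean; the maximizer is an exponential family p_i ∝ exp(β·x_i), and β is found by bisection on the mean constraint. The function name and example values are hypothetical.

```python
import math

def max_entropy_dist(values, target_mean, tol=1e-12):
    """Maximum-entropy distribution on `values` with a prescribed mean.

    The maximizer has the exponential-family form p_i ∝ exp(beta * x_i);
    since the mean is strictly increasing in beta, we can solve the
    constraint by bisection.
    """
    def mean_for(beta):
        # Unnormalized weights and their normalizing constant (partition function)
        w = [math.exp(beta * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    lo, hi = -50.0, 50.0  # bracket for the Lagrange multiplier beta
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2
    w = [math.exp(beta * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# Loaded-die example: among all distributions on {1,...,6} with mean 4.5,
# the maximum entropy one tilts the uniform distribution exponentially.
p = max_entropy_dist([1, 2, 3, 4, 5, 6], 4.5)
```

With `target_mean = 3.5` the routine recovers the uniform distribution, which is the classical sanity check for this construction.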





Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Topsøe, F. (2004). Information Theory and Complexity in Probability and Statistics. In: Soft Methodology and Random Information Systems. Advances in Soft Computing, vol 26. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-44465-7_44

  • DOI: https://doi.org/10.1007/978-3-540-44465-7_44

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22264-4

  • Online ISBN: 978-3-540-44465-7
