Abstract
Information theory is becoming increasingly important in many fields. This is true not only for engineering- and technology-based areas but also for more theoretically oriented sciences such as probability and statistics. Aspects of this development are discussed at a non-technical level, with emphasis on the role of information-theoretical games.
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Topsøe, F. (2004). Information Theory and Complexity in Probability and Statistics. In: Soft Methodology and Random Information Systems. Advances in Soft Computing, vol 26. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-44465-7_44
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22264-4
Online ISBN: 978-3-540-44465-7