
Statistical View of Information Theory

Reference work entry in: International Encyclopedia of Statistical Science

Information theory has origins and applications in several fields, such as thermodynamics, communication theory, computer science, economics, biology, mathematics, probability, and statistics. Owing to this diversity, numerous information measures appear in the literature. Kullback (1978), Sakamoto et al. (1986), and Pardo (2006) have applied several of these measures to almost all statistical inference problems.

According to the likelihood principle, all experimental information relevant to a parameter θ is contained in the likelihood function L(θ) of the underlying distribution. Bartlett’s information measure is −log L(θ). Entropy measures (see Entropy) are expectations of functions of the likelihood, divergence measures are expectations of functions of likelihood ratios, and Fisher-type information measures are expectations of functions of derivatives of the log-likelihood. DasGupta (2008, Chap. 2) reported several relations among members of these...
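As a minimal sketch (not part of the original entry), these measures can be written for an observation X with density f(x; θ); the notation below is an assumption chosen for exposition, with E_θ denoting expectation under f(·; θ):

\[ I_B(\theta) = -\log L(\theta), \qquad H(\theta) = -\mathrm{E}_{\theta}\!\left[\log f(X;\theta)\right], \]
\[ D(\theta_1 \,\|\, \theta_2) = \mathrm{E}_{\theta_1}\!\left[\log \frac{f(X;\theta_1)}{f(X;\theta_2)}\right], \qquad I_F(\theta) = \mathrm{E}_{\theta}\!\left[\left(\frac{\partial}{\partial \theta}\log f(X;\theta)\right)^{2}\right]. \]

Written this way the parallel is explicit: Bartlett’s measure is a function of the likelihood itself, the entropy H(θ) is an expectation of a function of the likelihood, the Kullback–Leibler divergence D(θ₁‖θ₂) is an expectation of a function of a likelihood ratio, and the Fisher information I_F(θ) is an expectation of a function of the derivative of the log-likelihood.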


References and Further Reading

  • Awad AM (1987) A statistical information measure. Dirasat (Science) 14(12):7–20
  • Bartlett MS (1936) Statistical information and properties of sufficiency. Proc R Soc Lond A 154:124–137
  • Basu D (1975) Statistical information and likelihood. Sankhya A 37(1):1–71
  • DasGupta A (2008) Asymptotic theory of statistics and probability. Springer, New York
  • Fisher RA (1925) Theory of statistical estimation. Proc Cambridge Philos Soc 22:700–725
  • Jaynes ET (1957) Information theory and statistical mechanics. Phys Rev 106:620–630 and 108:171–190
  • Kullback S (1978) Information theory and statistics. Peter Smith, Gloucester, MA
  • Lindley DV (1956) On a measure of the information provided by an experiment. Ann Math Stat 27:986–1005
  • Pardo L (2006) Statistical inference based on divergence measures. Chapman and Hall, New York
  • Sakamoto Y, Ishiguro M, Kitagawa G (1986) Akaike information criterion statistics. KTK Scientific Publishers, Tokyo
  • Shannon CE (1948) A mathematical theory of communication. Bell Syst Tech J 27(3):379–423 and 623–656
  • Wald A (1947) Sequential analysis. Dover, New York


Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

Cite this entry

Awad, A.M. (2011). Statistical View of Information Theory. In: Lovric, M. (eds) International Encyclopedia of Statistical Science. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04898-2_554
