Kybernetik, Volume 12, Issue 2, pp 55–57

Estimation of entropy of a binary variable: Satisfying a reliability criterion

  • Roger C. Conant

Abstract

Entropy of a binary variable X can be estimated from frequencies calculated on the basis of N observations. The paper considers a reliability criterion of the form Prob(|H − H_N| < ε) ≥ 1 − δ, where H and H_N are the true and estimated entropies respectively, and ε and δ define the reliability criterion. A procedure is suggested in which N is incremented until the probability is at least 1 − δ but less than 1 − δ/2, satisfying the criterion while avoiding a needlessly extravagant number of observations.
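The procedure described in the abstract can be sketched in code. This is a minimal illustration, not Conant's actual method: it assumes the true probability p is known so that the coverage Prob(|H − H_N| < ε) can be computed exactly over the binomial distribution of counts, whereas in practice p must itself be estimated. The function names (`entropy`, `coverage`, `smallest_N`) are hypothetical.

```python
from math import comb, log2

def entropy(p):
    # Binary entropy in bits; by convention H(0) = H(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1.0 - p) * log2(1.0 - p)

def coverage(N, p, eps):
    # Exact Prob(|H - H_N| < eps) for k ~ Binomial(N, p),
    # where H_N = entropy(k / N) is the plug-in estimate.
    H = entropy(p)
    total = 0.0
    for k in range(N + 1):
        if abs(H - entropy(k / N)) < eps:
            total += comb(N, k) * p**k * (1.0 - p)**(N - k)
    return total

def smallest_N(p, eps, delta, N_max=10000):
    # Increment N until the coverage first reaches 1 - delta,
    # meeting the criterion without an extravagant sample size.
    # Note: coverage is not monotone in N (binomial lattice
    # effects), so the first qualifying N is taken here.
    for N in range(1, N_max + 1):
        if coverage(N, p, eps) >= 1.0 - delta:
            return N
    raise ValueError("criterion not met within N_max observations")
```

For example, with p = 0.5, ε = 0.2 and δ = 0.1, the smallest qualifying sample size is N = 8; because the plug-in estimate moves on the lattice k/N, coverage can dip again at slightly larger N, which is one reason a careful stopping rule of the kind the paper proposes is needed.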




Copyright information

© Springer-Verlag 1973

Authors and Affiliations

  • Roger C. Conant
    1. Department of Information Engineering, University of Illinois at Chicago Circle, Chicago, USA
