Estimation of entropy of a binary variable: Satisfying a reliability criterion
Article
Abstract
Entropy of a binary variable X can be estimated from frequencies calculated on the basis of N observations. The paper considers a reliability criterion of the form Prob(|H−H_N| < ε) ≥ 1−δ, where H and H_N are the true and estimated entropies respectively, and ε and δ define the reliability criterion. A procedure is suggested in which N is incremented until the probability is at least 1−δ but less than 1−δ/2, satisfying the criterion while avoiding a needlessly extravagant number of observations.
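The stopping rule described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's procedure: it assumes the plug-in estimator H_N = H(k/N) with k successes in N Bernoulli trials, and it evaluates Prob(|H − H_N| < ε) exactly over the Binomial(N, p) distribution for a known true p, whereas the paper works from observed frequencies. The function names (`entropy`, `reliability`, `smallest_n`) are illustrative, not from the paper.

```python
import math

def entropy(p: float) -> float:
    """Binary entropy in bits, with H(0) = H(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def reliability(p: float, n: int, eps: float) -> float:
    """Prob(|H - H_N| < eps) for the plug-in estimator H_N = H(k/n),
    summed exactly over the Binomial(n, p) distribution of k."""
    h = entropy(p)
    total = 0.0
    for k in range(n + 1):
        if abs(h - entropy(k / n)) < eps:
            total += math.comb(n, k) * p**k * (1.0 - p) ** (n - k)
    return total

def smallest_n(p: float, eps: float, delta: float, n_max: int = 100_000) -> int:
    """Increment n until Prob(|H - H_N| < eps) >= 1 - delta first holds.
    A sketch of the incremental procedure; the paper's version also stops
    before the probability reaches 1 - delta/2, to avoid oversampling.
    Note the probability is not monotone in n, so this finds only the
    first n at which the criterion is met."""
    for n in range(1, n_max + 1):
        if reliability(p, n, eps) >= 1.0 - delta:
            return n
    raise ValueError("criterion not met within n_max observations")
```

For example, with p = 0.5 (so H = 1 bit), ε = 0.3, and δ = 0.1, the criterion is first satisfied at N = 5, where Prob(|H − H_N| < 0.3) = 30/32 ≈ 0.94.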
Copyright information
© Springer-Verlag 1973