Summary
Experimental determination of the entropy H and information rate T of experiments or information sources relies on measurements of the relative frequencies of events and thus furnishes only approximations to the exact values of H and T, which are defined by probabilities rather than relative frequencies. We derive an error estimate for the measured entropy and information rate that depends only on the number of possible events, not on the numerical values of their probabilities, and thereby answer the question of how often experiments should be repeated independently so that the measured entropy and information rate come sufficiently close to the exact values of H and T with probability sufficiently close to 1.
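The estimation procedure the summary refers to can be sketched as follows: entropy is computed by the plug-in rule, substituting relative frequencies for the unknown probabilities. The distribution, sample size, and function names below are illustrative assumptions; the paper's actual error bound is not reproduced here.

```python
import math
import random

def plugin_entropy(counts):
    """Plug-in (maximum-likelihood) entropy estimate in bits,
    using relative frequencies p_i = n_i / n in place of probabilities."""
    n = sum(counts)
    return -sum((c / n) * math.log2(c / n) for c in counts if c > 0)

# Hypothetical true distribution over 4 events.
probs = [0.5, 0.25, 0.125, 0.125]
H_exact = -sum(p * math.log2(p) for p in probs)  # exact entropy: 1.75 bits

# Repeat the experiment independently and count event occurrences.
random.seed(0)
n_trials = 10_000
samples = random.choices(range(len(probs)), weights=probs, k=n_trials)
counts = [samples.count(i) for i in range(len(probs))]
H_measured = plugin_entropy(counts)

print(f"exact H = {H_exact:.4f} bits, measured H = {H_measured:.4f} bits")
```

As the number of independent repetitions grows, the measured entropy converges toward the exact value; the paper's contribution is a bound on this error that depends only on the number of possible events.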
Pfaffelhuber, E. Error estimation for the determination of entropy and information rate from relative frequencies. Kybernetik 8, 50–51 (1971). https://doi.org/10.1007/BF00288732