Basic Concepts in Information Theory and Coding, pp. 207–242

# Infinite Discrete Sources

## Abstract

Agent 00111 was happiest taking cases that had only a finite number of outcomes. These were well-behaved in terms of pricing and computation. Even when the number of outcomes was extremely large, the Shannon-McMillan Theorem (Theorem 1.2a) often made the costing tractable. Sometimes, however, the number of possibilities was not finite: the number of outcomes could instead be countably or uncountably infinite. In Chapter 4, we consider infinite discrete distributions. For example, Agent 00111 may be asked to estimate how many days he would spend researching the true state of a country’s secret committees and to justify the time in terms of the uncertainty eliminated. In such cases, Agent 00111’s department could *theoretically* spend from here to eternity on the project, since the outcomes (in days) are countably infinite. In practice, this did not bother Agent 00111 too much—an obvious point of diminishing returns and increasing boredom caused him to quit after a period of time. However, as his scientists pointed out to him, just suppose that each day on a project yielded more information than the day before; how tempting it would be to stay.
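The scientists' point can be made concrete with a small numerical sketch (not from the chapter itself). A geometric distribution on the nonnegative integers is a countably infinite source whose entropy is nonetheless finite, matching the "diminishing returns" intuition. By contrast, a heavy-tailed distribution with unnormalized weights proportional to 1/(n (ln n)²) has summable probabilities but divergent entropy, so truncating it further out keeps revealing more uncertainty; the particular weight family is our illustrative choice, not one the chapter prescribes.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits of a (possibly truncated) distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Geometric source on n = 0, 1, 2, ...: p_n = (1 - q) * q**n.
# Its entropy is finite; for q = 1/2 it is exactly 2 bits.
q = 0.5
H_geo = entropy_bits([(1 - q) * q**n for n in range(200)])

# Heavy-tailed source: unnormalized weights w_n = 1/(n * (ln n)^2), n >= 2.
# The weights are summable, yet -sum p*log p diverges, so the entropy of
# truncated versions grows without bound as the cutoff N increases.
H_heavy = []
for N in (10**3, 10**4, 10**5):
    w = [1.0 / (n * math.log(n) ** 2) for n in range(2, N)]
    Z = sum(w)  # normalizing constant of the truncation
    H_heavy.append(entropy_bits([x / Z for x in w]))

print(f"geometric (q=0.5): {H_geo:.4f} bits")
print(f"heavy-tailed truncations: {[round(h, 3) for h in H_heavy]}")
```

Running this shows the geometric entropy settling at its closed-form value while the heavy-tailed truncations keep climbing (slowly, roughly like log log N) — the precise situation in which Agent 00111 would be tempted to stay on the project forever.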

### Keywords

Entropy · Agent 00111 · Prefix · Padding

