Infinite Discrete Sources

  • Solomon W. Golomb
  • Robert E. Peile
  • Robert A. Scholtz
Part of the Applications of Communications Theory book series (ACTH)


Agent 00111 was happiest taking cases that had only a finite number of outcomes. These were well-behaved in terms of pricing and computation. Even if the finite number of outcomes was extremely large, the Shannon-McMillan Theorem (Theorem 1.2a) often made the costing tractable. Sometimes, however, the number of outcomes was not finite: it could be countably or uncountably infinite. In Chapter 4, we consider infinite discrete distributions. For example, Agent 00111 may be asked to estimate how many days he would spend researching the true state of a country’s secret committees and to justify the time in terms of the uncertainty eliminated. In such cases, Agent 00111’s department could theoretically spend from here to eternity on the project, since the outcomes (in days) are countably infinite. In practice, this did not bother Agent 00111 too much—an obvious point of diminishing returns and increasing boredom caused him to quit after a period of time. However, as his scientists pointed out to him, just suppose that each day on a project he obtained more information than the day before; how tempting it would be to stay.
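A countably infinite set of outcomes can still have finite entropy. As an illustrative sketch (not taken from the chapter), the code below computes the entropy of a geometric distribution P(X = k) = (1 − p)^(k−1) p on k = 1, 2, … by truncating the infinite sum, and checks it against the standard closed form H = (−p log₂ p − (1 − p) log₂(1 − p)) / p bits; the truncation depth of 200 terms is an arbitrary choice for demonstration.

```python
import math

def geometric_entropy_numeric(p, terms=200):
    """Truncated entropy (bits) of P(X = k) = (1 - p)**(k - 1) * p, k = 1, 2, ..."""
    h = 0.0
    for k in range(1, terms + 1):
        pk = (1 - p) ** (k - 1) * p
        h -= pk * math.log2(pk)
    return h

def geometric_entropy_closed(p):
    """Closed form: H = (-p log2 p - (1 - p) log2(1 - p)) / p, in bits."""
    q = 1 - p
    return (-p * math.log2(p) - q * math.log2(q)) / p

# For p = 1/2 the entropy is exactly 2 bits, even though the support is infinite.
print(geometric_entropy_numeric(0.5))  # ≈ 2.0
print(geometric_entropy_closed(0.5))   # 2.0
```

The truncated sum converges to the closed form because the tail probabilities decay geometrically, which is exactly the well-behaved case; distributions with heavier tails can push the entropy to infinity.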


Keywords: Entropy · Agent 00111 · Prefix · Padding





Copyright information

© Springer Science+Business Media New York 1994

Authors and Affiliations

  • Solomon W. Golomb (1)
  • Robert E. Peile (2)
  • Robert A. Scholtz (3)

  1. Departments of Electrical Engineering and Mathematics, University of Southern California, Los Angeles, USA
  2. Racal Research, Limited, Reading, Berkshire, UK
  3. Department of Electrical Engineering, University of Southern California, Los Angeles, USA
