Introduction

  • Solomon W. Golomb
  • Robert E. Peile
  • Robert A. Scholtz
Chapter in the Applications of Communications Theory book series (ACTH)

Abstract

Secret Agent 00111 was in a most uncharacteristic mood; he was thinking about his career and remembering details that he was trained to forget. With some particularly sordid exceptions, it was not a story of universal appeal. However, as he neared the end of his service, he had been approached with several financial offers for the secrets of his legendary success. The process was all wrong, he thought glumly. The true secret of his success was more second nature than a mathematical equation, and it was probably not so salable as his backers believed. Oh well, he could always lie. . . . However, since he had been asked, he pondered what precisely he had bought and brought to the field of espionage.

Keywords

Mutual Information, Event Sequence, State Diagram, Entropy Function, Transition Probability Matrix
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer Science+Business Media New York 1994

Authors and Affiliations

  • Solomon W. Golomb (1)
  • Robert E. Peile (2)
  • Robert A. Scholtz (3)

  1. Departments of Electrical Engineering and Mathematics, University of Southern California, Los Angeles, USA
  2. Racal Research, Limited, Reading, Berkshire, UK
  3. Department of Electrical Engineering, University of Southern California, Los Angeles, USA
