Information Sources

  • Robert M. Gray
Part of the Kluwer International Series in Engineering and Computer Science book series (SECS, volume 83)

Abstract

An information source is modeled mathematically as a discrete-time random process, a sequence of random variables. This permits the use of all of the tools of the theory of probability and random processes. In particular, it allows us to do theory with probabilistic averages or expectations and to relate these to actual time-average performance through laws of large numbers or ergodic theorems. Such theorems, which describe the long-term behavior of well-behaved random systems, are crucial to this kind of theoretical analysis. Relating expectations and long-term time averages requires an understanding of the stationarity and ergodic properties of random processes, properties which are somewhat difficult to define precisely but which usually have a simple intuitive interpretation. These issues are not simply of concern to mathematical dilettantes. For example, stationarity can be violated by such commonly occurring phenomena as transients and variable-length coding, yet sample averages may still converge in a useful way. In this chapter we survey some of the key ideas from the theory of random processes. The chapter strongly overlaps portions of Chapter 2 of Gersho and Gray [30] and is intended to provide the necessary prerequisites and establish notation. The reader is assumed to be familiar with the general topics of probability and random processes; the chapter is intended primarily for reference and review.
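The connection between probabilistic expectations and single-path time averages can be illustrated numerically. Below is a minimal sketch, not taken from the chapter: it simulates an assumed first-order autoregressive (AR(1)) process whose coefficient, path length, and deliberately large initial state are illustrative choices. Because the process is ergodic, the time average of one long sample path approaches the stationary expectation even though the initial transient violates stationarity.

```python
# Minimal illustrative sketch (assumed AR(1) model, not from the chapter):
# for an ergodic process, time averages along one sample path approach the
# corresponding probabilistic expectations.
import numpy as np

rng = np.random.default_rng(0)

a = 0.9          # AR(1) coefficient (assumed for illustration)
n = 200_000      # length of the single sample path

# One sample path of X_{k+1} = a X_k + W_k with i.i.d. zero-mean, unit-variance noise.
w = rng.standard_normal(n)
x = np.empty(n)
x[0] = 10.0      # large initial state: a transient that violates stationarity
for k in range(n - 1):
    x[k + 1] = a * x[k] + w[k]

# The stationary mean is 0 and the stationary second moment is 1/(1 - a^2) ≈ 5.26.
# The transient dies out, so the time averages still converge toward these values.
print("time average of the sample path:", x.mean())
print("time average of squared values :", np.mean(x**2))
```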

Keywords

Covariance, Autocorrelation


Copyright information

© Kluwer Academic Publishers 1990

Authors and Affiliations

  • Robert M. Gray
    Stanford University, Stanford, USA
