Abstract
An information source, or simply source, is a mathematical model for a physical entity that produces a succession of symbols called “outputs” in a random manner. The symbols produced may be real numbers such as voltage measurements from a transducer, binary numbers as in computer data, two-dimensional intensity fields as in a sequence of images, continuous or discontinuous waveforms, and so on. The space containing all of the possible output symbols is called the alphabet of the source, and a source is essentially an assignment of a probability measure to events consisting of sets of sequences of symbols from the alphabet. It is useful, however, to treat the notion of time explicitly as a transformation of sequences produced by the source. Thus, in addition to the common random process model, we shall also consider modeling sources by dynamical systems as studied in ergodic theory. The material in this chapter is a distillation of [55, 58] and is intended to establish notation.
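The two views described above can be sketched concretely. The following is a minimal illustration, not drawn from the chapter itself: a Bernoulli(p) source over the binary alphabet {0, 1} stands in for the random process model, and a `shift` function stands in for the time-shift transformation of the dynamical-systems view; all names here are hypothetical.

```python
import random

def bernoulli_source(p=0.5, seed=0):
    """An i.i.d. binary source: alphabet {0, 1}, with P(output = 1) = p.

    Each call to the generator yields the next random output symbol,
    modeling the source as a discrete-time random process.
    """
    rng = random.Random(seed)
    while True:
        yield 1 if rng.random() < p else 0

def shift(sequence):
    """The shift transformation T: (x0, x1, x2, ...) -> (x1, x2, ...).

    In the dynamical-systems view, advancing time by one step is
    modeled by applying T to the entire output sequence.
    """
    return sequence[1:]

# Draw a finite window of outputs from the source.
gen = bernoulli_source(p=0.5, seed=42)
x = [next(gen) for _ in range(10)]

# Every output symbol lies in the alphabet {0, 1}.
assert all(s in (0, 1) for s in x)

# Applying the shift drops the first symbol: time advances one step.
assert shift(x) == x[1:]
```

In this toy model the probability measure on sequence events is implicit in the generator; the chapter's formal development instead defines the measure directly on sets of sequences, with the shift as a measurable transformation.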
Copyright information
© 2011 Springer Science + Business Media, LLC
Cite this chapter
Gray, R.M. (2011). Information Sources. In: Entropy and Information Theory. Springer, Boston, MA. https://doi.org/10.1007/978-1-4419-7970-4_1
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4419-7969-8
Online ISBN: 978-1-4419-7970-4