General Theory of Noiseless Channels, pp. 40–47

# Entropy with Respect to a Cost Scale


## Abstract

Let us inspect Lemma 3 once more. We may consider ξ(1, ν(N)) in the following way. The time for the information signals is measured according to their code length, and we ask what happens on this time-scale up to N. H(ξ(1, ν(N))) is the entropy of this event and

H(ξ(1, ν(N)))/N

is the entropy (per unit time) of the source according to this time-scale. There are many examples of this type. Assume we know the entropy of a written text. The durations of the different letters in the spoken text differ, so if we want to know the entropy (per unit time) of the spoken text, it is suitable to measure it with respect to the time-scale of the durations. If the text is encoded by the Morse alphabet we may also consider the entropy per unit time, since the three Morse signals have different lengths.
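The idea above — entropy divided by time measured on a cost scale — can be sketched for a memoryless source. The function and the Morse-like probabilities and durations below are illustrative assumptions, not taken from the chapter:

```python
import math

def entropy_per_unit_cost(probabilities, costs):
    """Source entropy divided by the expected symbol cost (duration):
    bits per unit time on the given cost scale."""
    # Shannon entropy in bits of one source symbol.
    h = -sum(p * math.log2(p) for p in probabilities if p > 0)
    # Expected duration of one symbol on the cost scale.
    mean_cost = sum(p * c for p, c in zip(probabilities, costs))
    return h / mean_cost

# Hypothetical Morse-like alphabet: dot, dash, and pause with
# durations 1, 3, and 2 time units (probabilities made up here).
probs = [0.5, 0.3, 0.2]
durations = [1.0, 3.0, 2.0]
rate = entropy_per_unit_cost(probs, durations)
```

When all costs are equal to one, the quantity reduces to the ordinary entropy per symbol, which is the sense in which the cost scale generalizes the usual time scale.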



## Copyright information

© Springer-Verlag Wien 1970