Abstract
Let us inspect Lemma 3 once more. We may consider ξ(1, ν(N)) in the following way: the time taken by the information signals is measured by their code length, and we ask what happens on this time-scale up to N. H(ξ(1, ν(N))) is the entropy of this event, and

lim_{N→∞} H(ξ(1, ν(N)))/N

is the entropy (per unit time) of the source on this time-scale. There are many examples of this type. Suppose we know the entropy of a written text. The durations of the different letters in the spoken text differ; thus, if we want to know the entropy (per unit time) of the spoken text, it is natural to measure it with respect to the time-scale of the durations. If the text is encoded in the Morse alphabet, we may likewise consider the entropy per unit time, since the three Morse signals have different lengths.
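For an i.i.d. source the limit above reduces to the per-symbol entropy divided by the mean symbol duration, which gives a concrete way to compute the entropy per unit time. The following is a minimal sketch of that computation; the function name and the Morse-style durations and probabilities are illustrative assumptions, not taken from the text.

```python
import math

def entropy_per_unit_cost(probs, costs):
    """Entropy per unit cost (time): H(p) / sum_i p_i * c_i.

    For an i.i.d. source whose symbols occur with probabilities
    `probs` and take durations `costs`, the entropy per unit time
    is the per-symbol entropy divided by the mean symbol duration.
    """
    h = -sum(p * math.log2(p) for p in probs if p > 0)  # bits per symbol
    mean_cost = sum(p * c for p, c in zip(probs, costs))  # time per symbol
    return h / mean_cost

# Hypothetical Morse-style alphabet: dot (1 time unit), dash (3 units),
# letter gap (3 units); the probabilities here are purely illustrative.
probs = [0.5, 0.3, 0.2]
costs = [1.0, 3.0, 3.0]
rate = entropy_per_unit_cost(probs, costs)  # bits per unit time
```

When all durations are equal, the expression collapses to the usual per-symbol entropy, as the text's comparison of written and spoken text suggests.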
Copyright information
© 1970 Springer-Verlag Wien
Cite this chapter
Katona, G. (1970). Entropy with Respect to a Cost Scale. In: General Theory of Noiseless Channels. International Centre for Mechanical Sciences, vol 31. Springer, Vienna. https://doi.org/10.1007/978-3-7091-2872-5_3
DOI: https://doi.org/10.1007/978-3-7091-2872-5_3
Publisher Name: Springer, Vienna
Print ISBN: 978-3-211-81167-2
Online ISBN: 978-3-7091-2872-5