Theory of Information and its Value pp 77-101

# First asymptotic theorem and related results

## Abstract

In the previous chapter, for one particular example (see Sections 3.1 and 3.4), we showed that in calculating the maximum entropy (i.e. the capacity of a noiseless channel) the constraint \(c(y) \leqslant a\) imposed on feasible realizations is equivalent, for sufficiently long code sequences, to the constraint \(\mathbb{E}[c(y)] \leqslant a\) on the mean value \(\mathbb{E}[c(y)]\). In this chapter we prove (Section 4.3) that, under certain assumptions, this equivalence holds in the general case; this is the assertion of the first asymptotic theorem. In what follows we shall also consider the other two asymptotic theorems (Chapters 7 and 11), which are among the most profound results of information theory. All of them share a common feature: each states that, in the limit of very large systems, the distinction between discreteness and continuity disappears, so that the characteristics of a large collection of discrete objects can be calculated from a continuous functional dependence involving averaged quantities. For the first variational problem, this feature is expressed by the fact that the discrete function \(H = \ln M\) of \(a\), defined under the constraint \(c(y) \leqslant a\), is asymptotically replaced by the continuous function \(H(a)\) obtained by solving the first variational problem. As far as the proof is concerned, the first asymptotic theorem turns out to be related to the theorem on the stability of the canonical distribution (Section 4.2), which is very important in statistical thermodynamics and which is in effect proved there when the canonical distribution is derived from the microcanonical one. Here we consider it in a more general and abstract form. The relationship between the first asymptotic theorem and the theorem on the canonical distribution once more underlines the intrinsic unity of the mathematical apparatus of information theory and statistical thermodynamics.
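The equivalence described above can be illustrated numerically. The sketch below is not from the text: it assumes, purely for illustration, a binary alphabet with costs \(c(0)=0\), \(c(1)=1\) and constraint level \(a=0.3\). On the continuous side, the maximum-entropy distribution under \(\mathbb{E}[c(y)] \leqslant a\) is the canonical one, which here reduces to \(p(1)=a\); on the discrete side, \(M_n\) counts the length-\(n\) sequences satisfying the hard constraint \(c(y) \leqslant an\). The per-symbol entropy \(\frac{1}{n}\ln M_n\) approaches \(H(a)\) as \(n\) grows:

```python
import math

# Illustrative choices (not from the text): binary alphabet with
# costs c(0) = 0, c(1) = 1, and constraint level a.
a = 0.3

# Continuous side: entropy H(a) of the maximum-entropy (canonical)
# distribution subject to E[c(y)] <= a.  For a < 1/2 the constraint
# is active and the optimum is simply p(1) = a, p(0) = 1 - a.
H_a = -a * math.log(a) - (1 - a) * math.log(1 - a)

def log_M(n):
    """ln M_n, where M_n counts length-n binary sequences whose
    total cost satisfies c(y) <= a*n, i.e. at most floor(a*n) ones."""
    M = sum(math.comb(n, k) for k in range(int(a * n) + 1))
    return math.log(M)

# The per-symbol entropy (1/n) ln M_n of the hard-constraint count
# converges to the continuous value H(a) as n grows.
for n in (50, 200, 1000):
    print(n, log_M(n) / n, H_a)
```

Here the discrete quantity \(\frac{1}{n}\ln M_n\) is already within about \(10^{-2}\) of \(H(a) \approx 0.611\) at \(n = 1000\), which is the sense in which the constraint on realizations and the constraint on the mean become interchangeable.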