Summary
Two types of necessary conditions are given for the convergence of Kullback-Leibler's mean information. The first is connected with an asymptotic equivalence of two sequences of probability measures and, in special cases, with the convergence of a sequence of probability distributions. The second is stated in terms of the generalized probability density functions.
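For context, Kullback-Leibler's mean information between two probability measures may be written as follows (this is the standard definition; the notation below is illustrative and may differ from the paper's own). Let \mu_1 and \mu_2 be probability measures dominated by a \sigma-finite measure \lambda, with generalized probability density functions f_1 = d\mu_1/d\lambda and f_2 = d\mu_2/d\lambda. Then

\[
I(\mu_1 : \mu_2)
\;=\; \int \log \frac{d\mu_1}{d\mu_2} \, d\mu_1
\;=\; \int f_1 \log \frac{f_1}{f_2} \, d\lambda ,
\]

with the usual convention that I(\mu_1 : \mu_2) = +\infty when \mu_1 is not absolutely continuous with respect to \mu_2. The necessary conditions of the paper concern sequences of such measures, i.e. the behavior of I(\mu_1^{(n)} : \mu_2^{(n)}) as n \to \infty.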
Cite this article
Ikeda, S. Necessary conditions for the convergence of Kullback-Leibler's mean information. Ann. Inst. Stat. Math. 14, 107–118 (1962). https://doi.org/10.1007/BF02868631