Abstract
A characterization of the normal distribution by the statistical independence of the components of a linear transformation of two mutually independent random variables is proved using the convolution inequality for the Fisher information.
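For readers unfamiliar with the tools involved, the following illustrative sketch (not reproduced from the paper itself) states the convolution inequality, due to Stam (1959) and Blachman (1965), and the classical Kac-type form of the characterization treated in Itoh (1970). For independent random variables $X$ and $Y$ with sufficiently smooth densities and Fisher information $I(\cdot)$,
\[
  \frac{1}{I(X+Y)} \;\ge\; \frac{1}{I(X)} + \frac{1}{I(Y)},
\]
with equality if and only if $X$ and $Y$ are normal. In its classical form, the characterization reads: if $X$ and $Y$ are mutually independent and the linear transforms $X+Y$ and $X-Y$ are independent of each other, then $X$ and $Y$ are normally distributed. The linear transformation treated in the paper may be more general; this form is shown only as an illustration.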
References
Barron, A. R. (1984). Monotone central limit theorem for densities, Tech. Report 50, Department of Statistics, Stanford University.
Barron, A. R. (1986). Entropy and the central limit theorem, Ann. Probab., 14, 336–342.
Blachman, N. M. (1965). The convolution inequality for entropy powers, IEEE Trans. Inform. Theory, IT-11, 267–271.
Brown, L. D. (1982). A proof of the central limit theorem motivated by the Cramér-Rao inequality, Statistics and Probability: Essays in Honor of C. R. Rao (eds. G. Kallianpur, P. R. Krishnaiah and J. K. Ghosh), North-Holland, Amsterdam.
Feller, W. (1971). An Introduction to Probability Theory and Its Applications, Vol. II, Wiley, New York.
Itoh, Y. (1970). The information theoretic proof of Kac's theorem, Proc. Japan Acad., 46, 283–286.
Linnik, Yu. V. (1959). An information-theoretic proof of the central limit theorem with the Lindeberg condition, Theory Probab. Appl., 4, 288–299.
Murata, H. and Tanaka, H. (1974). An inequality for certain functional of multidimensional probability distributions, Hiroshima Math. J., 4, 75–81.
Stam, A. J. (1959). Some inequalities satisfied by the quantities of information of Fisher and Shannon, Inform. and Control, 2, 101–112.
Cite this article
Itoh, Y. An application of the convolution inequality for the Fisher information. Ann Inst Stat Math 41, 9–12 (1989). https://doi.org/10.1007/BF00049105