Abstract.
We give conditions for an O(1/n) rate of convergence of Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in L² spaces and Poincaré inequalities to provide a better understanding of the decrease in Fisher information implied by results of Barron and Brown. We show that if the standardized Fisher information ever becomes finite then it converges to zero.
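The O(1/n) decay can be checked numerically. The sketch below (not from the paper) convolves the density of centered Exp(1) summands — which have mean 0, variance 1, and satisfy a Poincaré inequality — and estimates the relative entropy D(T_n ‖ N(0,1)) of the standardized sum T_n = S_n/√n on a grid; the summand, grid, and tolerances are illustrative choices, not the authors'.

```python
import numpy as np

# Numerical sketch (not from the paper): for centered Exp(1) summands the
# relative entropy D(T_n || N(0,1)) of the standardized sum T_n = S_n / sqrt(n)
# decays at rate O(1/n), consistent with the abstract's claim.
dx = 0.005
x = np.arange(-1.0, 15.0, dx)
f = np.exp(-(x + 1.0))          # density of E - 1, E ~ Exp(1), truncated at 14
f /= f.sum() * dx               # renormalize after truncation

def rel_entropy_to_gaussian(dens, grid):
    """Riemann-sum estimate of D(dens || standard normal) on a uniform grid."""
    step = grid[1] - grid[0]
    phi = np.exp(-grid ** 2 / 2.0) / np.sqrt(2.0 * np.pi)
    m = dens > 1e-12            # skip numerically empty regions of the grid
    return float(np.sum(dens[m] * np.log(dens[m] / phi[m])) * step)

g, left = f.copy(), x[0]
D = {}
for n in range(1, 9):
    if n > 1:
        g = np.convolve(g, f) * dx   # density of S_n = X_1 + ... + X_n
        left += x[0]                 # support of S_n starts at n * (-1)
    grid = left + dx * np.arange(len(g))
    D[n] = rel_entropy_to_gaussian(np.sqrt(n) * g, grid / np.sqrt(n))

for n in (1, 2, 4, 8):
    print(n, round(D[n], 4), round(n * D[n], 3))   # n * D[n] stays bounded
```

Running this shows D[n] shrinking roughly in proportion to 1/n, with n·D[n] settling near a constant. An asymmetric summand is chosen deliberately: for symmetric densities the leading Edgeworth term vanishes and the decay is faster than 1/n, which would obscure the generic rate.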
References
Ball, K., Barthe, F., Naor, A.: Entropy jumps in the presence of a spectral gap. Duke Math. J. 119, 41–63 (2002)
Barron, A.R.: Entropy and the central limit theorem. Ann. Probab. 14, 336–342 (1986)
Blachman, N.M.: The convolution inequality for entropy powers. IEEE Trans. Inform. Theory 11, 267–271 (1965)
Borovkov, A.A., Utev, S.A.: On an inequality and a related characterisation of the normal distribution. Theory Probab. Appl. 28, 219–228 (1984)
Brown, L.D.: A proof of the Central Limit Theorem motivated by the Cramér-Rao inequality. In: G. Kallianpur, P.R. Krishnaiah, J.K. Ghosh, (eds), Statistics and Probability: Essays in Honour of C.R. Rao, North-Holland, New York, 1982, pp. 141–148
Brown, L.D., Gajek, L.: Information inequalities for the Bayes risk. Ann. Statist. 18, 1578–1594 (1990)
Cacoullos, Th.: On upper and lower bounds for the variance of a function of a random variable. Ann. Probab. 10, 799–809 (1982)
Casella, G., Berger, R.L.: Statistical inference. Wadsworth & Brooks/Cole Advanced Books & Software, Pacific Grove, CA, 1990
Fabian, V., Hannan, J.: On the Cramér-Rao inequality. Ann. Statist. 5, 197–205 (1977)
Gnedenko, B.V., Kolmogorov, A.N.: Limit distributions for sums of independent random variables. Addison-Wesley, Cambridge, Mass, 1954
Gross, L.: Logarithmic Sobolev inequalities. Amer. J. Math., 97, 1061–1083 (1975)
Holevo, A.S.: Asymptotic estimation of shift parameter of a quantum state. Preprint, 2003. quant-ph/0307225
Johnson, O.T.: Entropy inequalities and the Central Limit Theorem. Stochastic Process Appl. 88, 291–304 (2000)
Johnson, O.T., Suhov, Y.M.: Entropy and random vectors. J. Statist Phys. 104, 147–167 (2001)
Klaassen, C.A.J.: On an inequality of Chernoff. Ann. Probab. 13, 966–974 (1985)
Lehmann, E., Casella, G.: Theory of point estimation. Springer Texts in Statistics. Second edition, Springer-Verlag, New York, 1998
Linnik, Y.V.: An information-theoretic proof of the Central Limit Theorem with the Lindeberg Condition. Theory Probab. Appl. 4, 288–299 (1959)
Petrov, V.V.: Limit Theorems of Probability: Sequences of Independent Random Variables. Oxford Science Publications, Oxford, 1995
Prohorov, Y.V.: On a local limit theorem for densities. Doklady Akad. Nauk SSSR (N.S.) 83, 797–800 (1952). In Russian
Shimizu, R.: On Fisher's amount of information for location family. In: G.P. Patil et al. (eds), Statistical Distributions in Scientific Work, Vol. 3, Reidel, 1975, pp. 305–312
Stam, A.J.: Some inequalities satisfied by the quantities of information of Fisher and Shannon. Inform. and Control 2, 101–112 (1959)
OTJ is a Fellow of Christ's College, Cambridge, which helped support two trips to Yale University during which this paper was written.
Mathematics Subject Classification (2000): Primary 62B10; Secondary 60F05, 94A17
Johnson, O., Barron, A. Fisher information inequalities and the central limit theorem. Probab. Theory Relat. Fields 129, 391–409 (2004). https://doi.org/10.1007/s00440-004-0344-0