Abstract
Generalizations of the linear score function, a well-known concept in theoretical statistics, are introduced. Just as the Gaussian density and the classical Fisher information are closely related to the linear score, nonlinear (respectively fractional) score functions make it possible to identify generalized Gaussian densities (respectively Lévy stable laws) as the unique probability densities for which the score of a random variable X is proportional to \(-X\). In all cases, it is shown that the variance of the score relative to the generalized Gaussian (respectively Lévy stable) density provides an upper bound on the \(L^1\)-distance from the generalized Gaussian density (respectively the Lévy stable law). Connections with nonlinear and fractional Fokker–Planck type equations are introduced and discussed.
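As a point of reference (a standard identity, not spelled out in the abstract), the linear score of a smooth density f and its Gaussian characterization read:

\[
\rho(x) \;=\; \frac{f'(x)}{f(x)}, \qquad
f(x) \;=\; \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2/(2\sigma^2)}
\;\Longrightarrow\;
\rho(x) \;=\; -\frac{x}{\sigma^2},
\]

so the centered Gaussian is the density whose score is proportional to \(-x\). Moreover, since \(\int f\rho \,dx = \int f'\,dx = 0\), the Fisher information \(I(f) = \int f(x)\,\rho(x)^2\,dx\) is precisely the variance of the score \(\rho(X)\).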
References
Blanchet, A., Bonforte, M., Dolbeault, J., Grillo, G., Vázquez, J.L.: Hardy-Poincaré inequalities and applications to nonlinear diffusions. Compt. Rendus Math. 344, 431–436 (2007)
Blanchet, A., Bonforte, M., Dolbeault, J., Grillo, G., Vázquez, J.L.: Asymptotics of the fast diffusion equation via entropy estimates. Arch. Ration. Mech. Anal. 191, 347–385 (2009)
Bobkov, S.G., Chistyakov, G.P., Götze, F.: Fisher information and the central limit theorem. Probab. Theory Related Fields 159, 1–59 (2014)
Bobkov, S.G., Chistyakov, G.P., Götze, F.: Fisher information and convergence to stable laws. Bernoulli 20(3), 1620–1646 (2014)
Bonforte, M., Dolbeault, J., Grillo, G., Vázquez, J.L.: Sharp rates of decay of solutions to the nonlinear fast diffusion equation via functional inequalities. Proc. Natl. Acad. Sci. USA 107, 16459–16464 (2010)
Bonforte, M., Grillo, G., Vázquez, J.L.: Special fast diffusion with slow asymptotics, entropy method and flow on a riemannian manifold. Arch. Ration. Mech. Anal. 196, 631–680 (2010)
Carrillo, J.A., Toscani, G.: Asymptotic \(L^1\)-decay of solutions of the porous medium equation to self-similarity. Indiana Univ. Math. J. 49, 113–142 (2000)
Cox, D.R., Hinkley, D.V.: Theoretical statistics. Chapman & Hall, London (1974)
Csiszár, I.: Information-type measures of difference of probability distributions and indirect observations. Stud. Sci. Math. Hung. 2, 299–318 (1967)
Del Pino, M., Dolbeault, J.: Best constants for Gagliardo-Nirenberg inequalities and applications to nonlinear diffusions. J. Math. Pures Appl. 81, 847–875 (2002)
Dolbeault, J., Toscani, G.: Improved interpolation inequalities, relative entropy and fast diffusion equations. Ann. de l’ Institut Henri Poincaré (C) Non Linear Anal. 30(5), 917–934 (2013)
Gnedenko, B.V., Kolmogorov, A.N.: Limit distributions for sums of independent random variables. Addison-Wesley, Cambridge, Mass (1954)
Guo, D.: Relative entropy and score function: new information-estimation relationships through arbitrary additive perturbation. In: Proc. IEEE Int. Symp. Inform. Theory, Seoul, Korea, pp. 814–818 (2009)
Kullback, S.: A lower bound for discrimination information in terms of variation. IEEE Trans. Inf. Theory 13, 126–127 (1967)
Johnson, O.: Entropy inequalities and the central limit theorem. Stoch. Process. Appl. 88, 291–304 (2000)
Johnson, O., Barron, A.R.: Fisher information inequalities and the central limit theorem. Probab. Theory Related Fields 129, 391–409 (2004)
Laha, R.G., Rohatgi, V.K.: Probability theory. Wiley Series in Probability and Mathematical Statistics. Wiley, New York, Chichester, Brisbane (1979)
Lieb, E.H.: Sharp constants in the Hardy-Littlewood-Sobolev and related inequalities. Ann. Math. 118, 349–374 (1983)
Linnik, Yu.V.: Linear forms and statistical criteria. II. Ukrainskii Mat. Zhurnal 5, 247–290 (1953)
Linnik, Yu.V.: Linear forms and statistical criteria. I, II. Transl. Math. Statist. Prob. 3, 1–90 (1962) (Am. Math. Soc., Providence, R.I.)
Madiman, M., Barron, A.R.: Generalized entropy power inequalities and monotonicity properties of information. IEEE Trans. Inf. Theory 53(4), 2317–2329 (2007)
Riesz, M.: L’intégrale de Riemann-Liouville et le problème de Cauchy. Acta Math. 81, 1–223 (1949)
Stam, A.J.: Some inequalities satisfied by the quantities of information of Fisher and Shannon. Inf. Contr. 2, 101–112 (1959)
Stein, E.M.: Singular integrals and differentiability properties of functions. Princeton Mathematical Series, vol. 30. Princeton University Press, Princeton (1970)
Toscani, G.: Sur l’inegalité logarithmique de Sobolev. C. R. Acad. Sci. Paris 324, 689–694 (1997)
Toscani, G.: Entropy dissipation and the rate of convergence to equilibrium for the Fokker-Planck equation. Quart. Appl. Math. LVII, 521–541 (1999)
Toscani, G.: The fractional Fisher information and the central limit theorem for stable laws. Ricerche Mat. (2016, in press). arXiv:1504.07057
Toscani, G.: Entropy inequalities for stable densities and strengthened central limit theorems (2015). arXiv:1512.05874 (preprint)
Acknowledgments
This work has been written within the activities of the National Group of Mathematical Physics of INdAM (National Institute of Higher Mathematics). The support of the Project "Optimal mass transportation, geometrical and functional inequalities with applications", financed by the Ministry of University and Research, is kindly acknowledged.
Appendix
In this short appendix we summarize the mathematical notation and the meaning of the fractional derivative. Given a probability density f(x), \(x \in {\mathbb {R}}^n\), we define its Fourier transform \(\mathcal F(f)\) by
\[
\mathcal F(f)(\xi ) = \widehat{f}(\xi ) = \int _{{\mathbb {R}}^n} e^{-i\,\xi \cdot x}\, f(x)\, dx, \qquad \xi \in {\mathbb {R}}^n.
\]
Let us set \(n=1\). Then, the one-dimensional fractional derivative \(\mathcal D_{\alpha }\) is defined as follows. For \(0 <\alpha < 1\) we let \(R_\alpha \) be the one-dimensional normalized Riesz potential operator, defined for locally integrable functions by [22, 24]
\[
R_\alpha f(x) = S(\alpha ) \int _{{\mathbb {R}}} \frac{f(y)}{|x-y|^{1-\alpha }}\, dy.
\]
The constant \(S(\alpha )\) is chosen to have
\[
\mathcal F(R_\alpha f)(\xi ) = |\xi |^{-\alpha }\, \widehat{f}(\xi ). \qquad (39)
\]
Since for \(0 <\alpha < 1\) it holds [18]
\[
\mathcal F\bigl (|x|^{\alpha -1}\bigr )(\xi ) = |\xi |^{-\alpha }\, 2^{\alpha }\sqrt{\pi }\, \frac{\Gamma (\alpha /2)}{\Gamma ((1-\alpha )/2)},
\]
where, as usual, \(\Gamma (\cdot )\) denotes the Gamma function, the value of \(S(\alpha )\) is given by
\[
S(\alpha ) = \frac{\Gamma ((1-\alpha )/2)}{2^{\alpha }\sqrt{\pi }\, \Gamma (\alpha /2)}.
\]
Note that \(S(\alpha ) = S(1-\alpha )\).
We then define the fractional derivative of order \(\alpha \) of a real function f as (\(0<\alpha <1\))
\[
\mathcal D_\alpha f(x) = \frac{d}{dx}\, R_{1-\alpha } f(x).
\]
Thanks to (39), in Fourier variables
\[
\mathcal F(\mathcal D_\alpha f)(\xi ) = i\, \frac{\xi }{|\xi |}\, |\xi |^{\alpha }\, \widehat{f}(\xi ).
\]
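The Fourier characterization above lends itself to a quick numerical sanity check. The sketch below (an illustration under stated assumptions, not code from the paper; the helper `fractional_derivative` and the grid are hypothetical choices) applies the multiplier \(i\,(\xi /|\xi |)\,|\xi |^{\alpha }\) through the FFT to a standard Gaussian density, and verifies that for \(\alpha \) close to 1 the result approaches the ordinary derivative \(-x f(x)\):

```python
import numpy as np

def fractional_derivative(f_vals, x, alpha):
    """Apply D_alpha via its Fourier multiplier i * sign(xi) * |xi|^alpha.

    Assumes f_vals samples a function on a uniform grid x, decaying
    to (near) zero at both ends so the periodic FFT is a fair proxy
    for the Fourier transform on the real line.
    """
    n = x.size
    dx = x[1] - x[0]
    xi = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)        # angular frequencies
    f_hat = np.fft.fft(f_vals)
    symbol = 1j * np.sign(xi) * np.abs(xi) ** alpha   # multiplier of D_alpha
    return np.real(np.fft.ifft(symbol * f_hat))

# standard Gaussian density on a wide grid
x = np.linspace(-20.0, 20.0, 4096)
f = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)

# for alpha -> 1 the multiplier tends to i*xi, i.e. the ordinary derivative,
# and for the Gaussian f'(x) = -x f(x)
d_near1 = fractional_derivative(f, x, 0.999)
err = np.max(np.abs(d_near1 - (-x * f)))
```

At \(\alpha = 1\) the multiplier is exactly \(i\xi \), so the small value of `err` at \(\alpha = 0.999\) is consistent with the Fourier formula; the same routine evaluates \(\mathcal D_{1/2}\) and the other fractional orders used in the text.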
Toscani, G.: Score functions, generalized relative Fisher information and applications. Ricerche Mat. 66, 15–26 (2017). https://doi.org/10.1007/s11587-016-0281-0