
Score functions, generalized relative Fisher information and applications

Published in: Ricerche di Matematica

Abstract

Generalizations of the linear score function, a well-known concept in theoretical statistics, are introduced. As the Gaussian density and the classical Fisher information are closely related to the linear score, nonlinear (respectively fractional) score functions make it possible to identify generalized Gaussian densities (respectively Lévy stable laws) as the unique probability densities for which the score of a random variable X is proportional to \(-X\). In all cases, it is shown that the variance of the score relative to the generalized Gaussian density (respectively Lévy stable law) provides an upper bound for the \(L^1\)-distance from the generalized Gaussian density (respectively Lévy stable law). Connections with nonlinear and fractional Fokker–Planck type equations are introduced and discussed.


References

  1. Blanchet, A., Bonforte, M., Dolbeault, J., Grillo, G., Vázquez, J.L.: Hardy-Poincaré inequalities and applications to nonlinear diffusions. Compt. Rendus Math. 344, 431–436 (2007)

  2. Blanchet, A., Bonforte, M., Dolbeault, J., Grillo, G., Vázquez, J.L.: Asymptotics of the fast diffusion equation via entropy estimates. Arch. Ration. Mech. Anal. 191, 347–385 (2009)

  3. Bobkov, S.G., Chistyakov, G.P., Götze, F.: Fisher information and the central limit theorem. Probab. Theory Related Fields 159, 1–59 (2014)

  4. Bobkov, S.G., Chistyakov, G.P., Götze, F.: Fisher information and convergence to stable laws. Bernoulli 20(3), 1620–1646 (2014)

  5. Bonforte, M., Dolbeault, J., Grillo, G., Vázquez, J.L.: Sharp rates of decay of solutions to the nonlinear fast diffusion equation via functional inequalities. Proc. Natl. Acad. Sci. USA 107, 16459–16464 (2010)

  6. Bonforte, M., Grillo, G., Vázquez, J.L.: Special fast diffusion with slow asymptotics, entropy method and flow on a Riemannian manifold. Arch. Ration. Mech. Anal. 196, 631–680 (2010)

  7. Carrillo, J.A., Toscani, G.: Asymptotic \(L^1\)-decay of solutions of the porous medium equation to self-similarity. Indiana Univ. Math. J. 49, 113–142 (2000)

  8. Cox, D.R., Hinkley, D.V.: Theoretical statistics. Chapman & Hall, London (1974)

  9. Csiszár, I.: Information-type measures of difference of probability distributions and indirect observations. Stud. Sci. Math. Hung. 2, 299–318 (1967)

  10. Del Pino, M., Dolbeault, J.: Best constants for Gagliardo-Nirenberg inequalities and applications to nonlinear diffusions. J. Math. Pures Appl. 81, 847–875 (2002)

  11. Dolbeault, J., Toscani, G.: Improved interpolation inequalities, relative entropy and fast diffusion equations. Ann. de l’Institut Henri Poincaré (C) Non Linear Anal. 30(5), 917–934 (2013)

  12. Gnedenko, B.V., Kolmogorov, A.N.: Limit distributions for sums of independent random variables. Addison-Wesley, Cambridge, Mass (1954)

  13. Guo, D.: Relative entropy and score function: new information-estimation relationships through arbitrary additive perturbation. In: Proc. IEEE Int. Symp. Inform. Theory, Seoul, Korea, pp. 814–818 (2009)

  14. Kullback, S.: A lower bound for discrimination information in terms of variation. IEEE Trans. Inf. Theory 13, 126–127 (1967)

  15. Johnson, O.: Entropy inequalities and the central limit theorem. Stoch. Process. Appl. 88, 291–304 (2000)

  16. Johnson, O., Barron, A.R.: Fisher information inequalities and the central limit theorem. Probab. Theory Related Fields 129, 391–409 (2004)

  17. Laha, R.G., Rohatgi, V.K.: Probability theory. Wiley Series in Probability and Mathematical Statistics. Wiley, New York, Chichester, Brisbane (1979)

  18. Lieb, E.H.: Sharp constants in the Hardy-Littlewood-Sobolev and related inequalities. Ann. Math. 118, 349–374 (1983)

  19. Linnik, Yu.V.: Linear forms and statistical criteria. II. Ukrainskii Mat. Zhurnal 5, 247–290 (1953)

  20. Linnik, Yu.V.: Linear forms and statistical criteria. I, II. Transl. Math. Statist. Prob. 3, 1–90 (1962) (Am. Math. Soc., Providence, R.I.)

  21. Madiman, M., Barron, A.R.: Generalized entropy power inequalities and monotonicity properties of information. IEEE Trans. Inf. Theory 53(4), 2317–2329 (2007)

  22. Riesz, M.: L’intégrale de Riemann-Liouville et le problème de Cauchy. Acta Math. 81, 1–223 (1949)

  23. Stam, A.J.: Some inequalities satisfied by the quantities of information of Fisher and Shannon. Inf. Contr. 2, 101–112 (1959)

  24. Stein, E.M.: Singular integrals and differentiability properties of functions. Princeton Mathematical Series, vol. 30. Princeton University Press, Princeton (1970)

  25. Toscani, G.: Sur l’inegalité logarithmique de Sobolev. C. R. Acad. Sci. Paris 324, 689–694 (1997)

  26. Toscani, G.: Entropy dissipation and the rate of convergence to equilibrium for the Fokker-Planck equation. Quart. Appl. Math. LVII, 521–541 (1999)

  27. Toscani, G.: The fractional Fisher information and the central limit theorem for stable laws. Ricerche Mat. (2016, in press). arXiv:1504.07057

  28. Toscani, G.: Entropy inequalities for stable densities and strengthened central limit theorems (2015). arXiv:1512.05874 (preprint)

Acknowledgments

This work has been written within the activities of the National Group of Mathematical Physics of INdAM (National Institute of High Mathematics). The support of the project “Optimal mass transportation, geometrical and functional inequalities with applications”, financed by the Ministry of University and Research, is kindly acknowledged.

Author information

Correspondence to Giuseppe Toscani.

Appendix

In this short appendix we summarize the mathematical notation and the meaning of the fractional derivative. Given a probability density f(x), \(x \in {\mathbb {R}}^n\), we define its Fourier transform \(\mathcal F(f)\) by

$$\begin{aligned} \mathcal F(f)(\xi ) = \widehat{f}(\xi ) := \int _{{\mathbb {R}}^n} e^{- i \, \xi \cdot \, x} f(x)\, dx, \qquad \forall \xi \in {\mathbb {R}}^n. \end{aligned}$$
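For instance, with this convention the standard Gaussian density \(f(x) = (2\pi )^{-1/2} e^{-x^2/2}\) satisfies \(\widehat{f}(\xi ) = e^{-\xi ^2/2}\). A short numerical check (an illustrative sketch, not part of the paper):

```python
import numpy as np

# Illustrative check of the Fourier-transform convention above: for the
# standard Gaussian density f(x) = exp(-x^2/2)/sqrt(2*pi), this convention
# gives \hat f(xi) = exp(-xi^2/2).
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
for xi in (0.0, 0.5, 1.0, 2.0):
    fhat = np.sum(np.exp(-1j * xi * x) * f) * dx   # Riemann sum for the integral
    assert abs(fhat - np.exp(-xi**2 / 2)) < 1e-8
```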

Let us set \(n=1\). Then the one-dimensional fractional derivative \(\mathcal D_{\alpha }\) is defined as follows. For \(0 <\alpha < 1\) we let \(R_\alpha \) be the one-dimensional normalized Riesz potential operator, defined for locally integrable functions by [22, 24]

$$\begin{aligned} R_\alpha (f)(x) = S(\alpha ) \int _{\mathbb {R}}\frac{f(y)\, dy}{|x-y|^{1-\alpha }}. \end{aligned}$$

The constant \(S(\alpha )\) is chosen to have

$$\begin{aligned} \widehat{R_\alpha (f)}(\xi ) = |\xi |^{-\alpha } \widehat{f}(\xi ). \end{aligned}$$
(39)

Since for \(0 <\alpha < 1\) it holds [18]

$$\begin{aligned} \mathcal F |x|^{\alpha -1} = |\xi |^{-\alpha } \pi ^{1/2} \Gamma \left( \frac{1-\alpha }{2} \right) \Gamma \left( \frac{\alpha }{2} \right) , \end{aligned}$$
(40)

where, as usual, \(\Gamma (\cdot )\) denotes the Gamma function, the value of \(S(\alpha )\) is given by

$$\begin{aligned} S(\alpha ) = \left[ \pi ^{1/2} \Gamma \left( \frac{1-\alpha }{2} \right) \Gamma \left( \frac{\alpha }{2} \right) \right] ^{-1}. \end{aligned}$$

Note that \(S(\alpha ) = S(1-\alpha )\).
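The symmetry \(S(\alpha ) = S(1-\alpha )\) of the normalization constant defined above can be verified numerically (an illustrative sketch; the function name S is ours):

```python
import math

# Illustrative check of the symmetry S(alpha) = S(1 - alpha) of the
# normalization constant S(alpha) = [sqrt(pi) * Gamma((1-alpha)/2) * Gamma(alpha/2)]^{-1}.
def S(alpha):
    return 1.0 / (math.sqrt(math.pi)
                  * math.gamma((1 - alpha) / 2)
                  * math.gamma(alpha / 2))

for alpha in (0.1, 0.25, 0.5, 0.9):
    assert math.isclose(S(alpha), S(1 - alpha), rel_tol=1e-12)
```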

We then define the fractional derivative of order \(\alpha \) (\(0<\alpha <1\)) of a real function f as

$$\begin{aligned} \frac{d^\alpha f(x)}{dx^\alpha } = {\mathcal {D}}_\alpha f(x) = \frac{d}{dx}R_{1-\alpha }(f)(x). \end{aligned}$$
(41)

Thanks to (39), in Fourier variables

$$\begin{aligned} \widehat{{\mathcal {D}}_\alpha f}(\xi ) = i \frac{\xi }{|\xi |}\, |\xi |^\alpha \widehat{f}(\xi ). \end{aligned}$$
(42)
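Formula (42) lends itself to a spectral implementation: multiply the discrete Fourier transform by the symbol \(i\,(\xi /|\xi |)\,|\xi |^\alpha \) and transform back. The sketch below is our own illustration (the helper frac_derivative is not from the paper); it checks the consistency requirement that for \(\alpha = 1\) the operator reduces to the ordinary derivative, using a Gaussian, for which \(f'(x) = -x f(x)\).

```python
import numpy as np

def frac_derivative(f_vals, dx, alpha):
    """Apply D_alpha via its Fourier symbol i*(xi/|xi|)*|xi|^alpha, cf. (42)."""
    n = f_vals.size
    xi = 2 * np.pi * np.fft.fftfreq(n, d=dx)        # angular frequency grid
    symbol = 1j * np.sign(xi) * np.abs(xi)**alpha   # i*(xi/|xi|)*|xi|^alpha
    return np.real(np.fft.ifft(symbol * np.fft.fft(f_vals)))

# Consistency check: for alpha = 1 the symbol is i*xi, the ordinary derivative.
x = np.linspace(-20.0, 20.0, 2048, endpoint=False)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2)
assert np.allclose(frac_derivative(f, dx, 1.0), -x * f, atol=1e-8)
```

The FFT implicitly assumes periodicity, so the grid must be wide enough that f and its derivative are negligible at the boundary, as is the case for the Gaussian above.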

About this article

Cite this article

Toscani, G. Score functions, generalized relative Fisher information and applications. Ricerche mat 66, 15–26 (2017). https://doi.org/10.1007/s11587-016-0281-0
