Summary
This paper deals with minimum distance (MD) estimators and minimum penalized distance (MPD) estimators based on the L_p distance. Rates of strong consistency of MPD density estimators are established within the family of density functions that have a bounded m-th derivative. For the case p = 2, it is also proved that the MPD density estimator achieves the optimal rate of decrease of the mean integrated square error and of the L_1 error. Estimation of derivatives of the density is considered as well.
In a class parametrized by entire functions, it is proved that the rate of convergence of the MD density estimator (and its derivatives) to the unknown density (and its derivatives) is of order 1/√n in expected L_1 and L_2 distances. In the same class of distributions, MD estimators of the unknown density and its derivatives are proved to achieve the extraordinary rate (log log n / n)^{1/2} of strong consistency.
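To make the minimum distance idea concrete: an MD estimator picks, within a given class of candidate densities, the one closest in L_p distance to some data-based estimate of the unknown density. The following is a rough illustrative sketch only, not the paper's construction (which works in nonparametric classes, e.g. of entire functions, and with penalized distances): it fits a normal density by minimizing a discretized L_2 distance to a histogram estimate. The parametric family, the grids, and the simulated sample are all assumptions made for illustration.

```python
import numpy as np

# Simulated data from an (in practice unknown) density; here N(1, 2^2).
rng = np.random.default_rng(0)
sample = rng.normal(loc=1.0, scale=2.0, size=2000)

# Grid on which the L_2 distance is approximated by a Riemann sum.
grid = np.linspace(-8.0, 10.0, 400)
dx = grid[1] - grid[0]

# Crude data-based density estimate: a histogram, interpolated onto the grid.
hist, edges = np.histogram(sample, bins=40, range=(grid[0], grid[-1]),
                           density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
f_hat = np.interp(grid, centers, hist)

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Minimum distance step: minimize the (discretized) L_2 distance between the
# candidate density and f_hat over a coarse (mu, sigma) grid.
l2_dist, mu_star, sigma_star = min(
    (np.sqrt(np.sum((normal_pdf(grid, mu, s) - f_hat) ** 2) * dx), mu, s)
    for mu in np.linspace(-1.0, 3.0, 41)
    for s in np.linspace(0.5, 4.0, 36)
)
```

The same template carries over to other distances (replace the squared differences by |·|^p) and to penalized variants, where a roughness penalty on the candidate is added to the distance before minimizing.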
Cite this article
Gajek, L. Estimating a density and its derivatives via the minimum distance method. Probab. Th. Rel. Fields 80, 601–617 (1989). https://doi.org/10.1007/BF00318908
Keywords
- Density Function
- Stochastic Process
- Probability Theory
- Minimum Distance
- Statistical Theory