Structural adaptive deconvolution under \({\mathbb{L}_p}\)-losses

Mathematical Methods of Statistics

Abstract

In this paper, we address the problem of estimating a multidimensional density f from indirect observations generated by the statistical model Y = X + ε. Here, ε is a measurement error, independent of the random vector X of interest, with a known density with respect to the Lebesgue measure. Our aim is to obtain the optimal accuracy of estimation under \({\mathbb{L}_p}\)-losses when the characteristic function of the error ε decays polynomially. To achieve this goal, we first construct a fully data-driven kernel estimator of f. We then derive an oracle inequality for this estimator under very mild assumptions on the characteristic function of the error ε. As a consequence, we obtain minimax adaptive upper bounds over a large scale of anisotropic Nikolskii classes, and we prove that our estimator is asymptotically rate optimal when p ∈ [2, +∞]. Furthermore, our estimation procedure adapts automatically to a possible independence structure of f, which allows us to improve the accuracy of estimation significantly.
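
For a concrete picture of the construction behind these results, the sketch below implements the classical one-dimensional Fourier deconvolution kernel estimator with a fixed bandwidth, for a Laplace measurement error whose characteristic function 1/(1 + t²) decays polynomially, as in the setting above. This is a minimal illustration under our own assumptions: the helper names deconv_kde and phi_eps, the simulated distributions, and the hand-picked bandwidth are ours, and the paper's fully data-driven, multivariate bandwidth-selection procedure is not reproduced here.

```python
# Minimal 1-D sketch of a Fourier deconvolution kernel estimator with a
# fixed bandwidth (illustrative only; not the paper's adaptive procedure).
import numpy as np

rng = np.random.default_rng(0)

# Indirect observations Y = X + eps, with eps independent of X and of known
# density (here Laplace(0,1), so phi_eps(t) = 1/(1+t^2): polynomial decay).
n = 2000
X = rng.normal(0.0, 1.0, size=n)        # density f to be recovered
eps = rng.laplace(0.0, 1.0, size=n)     # measurement error with known density
Y = X + eps                             # the only data the estimator sees

def phi_eps(t):
    """Characteristic function of the Laplace(0,1) error."""
    return 1.0 / (1.0 + t ** 2)

def deconv_kde(y, x_grid, h):
    """Deconvolution kernel estimate with a sinc kernel, whose rescaled
    Fourier transform is the indicator of [-1/h, 1/h]."""
    t = np.linspace(-1.0 / h, 1.0 / h, 1024)             # frequency grid
    phi_n = np.exp(1j * np.outer(t, y)).mean(axis=1)     # empirical c.f. of Y
    integrand = phi_n / phi_eps(t)                       # estimates phi_X = phi_Y / phi_eps
    # Inverse Fourier transform, evaluated on x_grid by the trapezoid rule.
    f_hat = np.trapz(np.exp(-1j * np.outer(x_grid, t)) * integrand, t, axis=1).real
    return np.clip(f_hat / (2.0 * np.pi), 0.0, None)     # crude positivity correction

x_grid = np.linspace(-4.0, 4.0, 201)
f_hat = deconv_kde(Y, x_grid, h=0.4)    # h fixed by hand here
```

A data-driven choice of h (and, in the multivariate case, of an anisotropic bandwidth vector) is precisely what the adaptive procedure studied in the paper provides; the fixed value above is for illustration only.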



Author information

Corresponding author

Correspondence to G. Rebelles.

About this article


Cite this article

Rebelles, G. Structural adaptive deconvolution under \({\mathbb{L}_p}\)-losses. Math. Meth. Stat. 25, 26–53 (2016). https://doi.org/10.3103/S1066530716010026

