Large and moderate deviations principles for kernel estimators of the multivariate regression


Abstract

In this paper, we prove a large deviations principle for the Nadaraya-Watson estimator and for the semi-recursive kernel estimator of the regression function in the multivariate case. Under suitable conditions, we show that the rate function is a good rate function, thereby generalizing the results already obtained in the one-dimensional case for the Nadaraya-Watson estimator. Moreover, we establish a moderate deviations principle for both estimators; it turns out that the rate function obtained in the moderate deviations principle for the semi-recursive estimator is larger than the one obtained for the Nadaraya-Watson estimator.
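For orientation, here is a minimal sketch of the two estimators in their standard multivariate form, assuming i.i.d. observations (X_1, Y_1), ..., (X_n, Y_n) with values in R^d x R, a kernel K on R^d, and a bandwidth sequence (h_n) decreasing to zero; the symbols r_n, h_i, K, and d below follow this common convention and may differ from the notation and conditions used in the full text.

\[
  r_n(x) \;=\; \frac{\sum_{i=1}^{n} Y_i \, K\!\left(\frac{x - X_i}{h_n}\right)}
                    {\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h_n}\right)},
  \qquad
  \tilde r_n(x) \;=\; \frac{\sum_{i=1}^{n} h_i^{-d} \, Y_i \, K\!\left(\frac{x - X_i}{h_i}\right)}
                          {\sum_{i=1}^{n} h_i^{-d} \, K\!\left(\frac{x - X_i}{h_i}\right)},
  \qquad x \in \mathbb{R}^d.
\]

Here r_n is the Nadaraya-Watson estimator, which uses a single bandwidth h_n for all observations, while \tilde r_n is a semi-recursive estimator in which the bandwidth h_i attached to the i-th observation is fixed when that observation arrives, so the estimator can be updated online; the abstract's comparison of moderate deviations rate functions contrasts these two constructions.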


Author information

Corresponding author

Correspondence to A. Mokkadem.

About this article

Cite this article

Mokkadem, A., Pelletier, M. & Thiam, B. Large and moderate deviations principles for kernel estimators of the multivariate regression. Math. Meth. Stat. 17, 146–172 (2008). https://doi.org/10.3103/S1066530708020051
