Local Polynomial Regression: Optimal Kernels and Asymptotic Minimax Efficiency

Abstract

We consider local polynomial fitting for estimating a regression function and its derivatives nonparametrically. This method possesses many nice features, among which are automatic adaptation to the boundary and adaptation to various designs. A first contribution of this paper is the derivation of an optimal kernel for local polynomial regression, revealing that there is a universal optimal weighting scheme. Fan (1993, Ann. Statist., 21, 196-216) showed that the univariate local linear regression estimator is the best linear smoother, in the sense that it attains the asymptotic linear minimax risk. Moreover, this smoother has high minimax efficiency. We show that this property also holds for the multivariate local linear regression estimator. In the univariate case we investigate the minimax efficiency of local polynomial regression estimators, and find that the asymptotic minimax efficiency for commonly-used orders of fit is 100% among the class of all linear smoothers. Further, we quantify the loss in efficiency when going beyond this class.
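To make the weighting scheme concrete, here is a minimal sketch (not the authors' implementation) of local linear fitting with Epanechnikov weights, solved pointwise by weighted least squares: the local intercept estimates the regression function and the local slope its first derivative. The bandwidth h = 0.1, the simulated sine regression, and all names in the snippet are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of local linear regression with Epanechnikov weights
# (the kernel identified here as the universal optimal weighting scheme).
# Bandwidth, grid and data below are illustrative choices.
import numpy as np


def epanechnikov(u):
    """Epanechnikov kernel K(u) = 0.75 * (1 - u^2) on [-1, 1]."""
    return 0.75 * np.maximum(1.0 - u ** 2, 0.0)


def local_linear(x0, x, y, h):
    """Local linear fit at x0 by weighted least squares.

    Regress y on (1, x - x0) with weights K((x - x0) / h): the intercept
    estimates m(x0) and the slope estimates m'(x0).  The same weighted
    least-squares form adapts automatically at the boundary.
    """
    w = epanechnikov((x - x0) / h)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.pinv(X.T @ W @ X) @ (X.T @ W @ y)
    return beta[0], beta[1]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0.0, 1.0, 200))
    y = np.sin(2.0 * np.pi * x) + 0.3 * rng.standard_normal(200)
    grid = np.linspace(0.0, 1.0, 101)
    m_hat = np.array([local_linear(x0, x, y, h=0.1)[0] for x0 in grid])
```

Higher-order fits (e.g. local cubic) follow the same pattern, with additional polynomial columns of (x - x0) in the design matrix.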

REFERENCES

  • Cheng, M. Y., Fan, J. and Marron, J. S. (1993). Minimax efficiency of local polynomial fit estimators at boundaries, Institute of Statistics Mimeo Series #2098, University of North Carolina, Chapel Hill.

  • Chu, C. K. and Marron, J. S. (1991). Choosing a kernel regression estimator, Statist. Sci., 6, 404–433.

  • Cleveland, W. S. and Loader, C. (1996). Smoothing by local regression: Principles and methods, Computational Statistics, 11 (to appear).

  • Donoho, D. L. (1994). Statistical estimation and optimal recovery, Ann. Statist., 22, 238–270.

  • Donoho, D. L. and Liu, R. C. (1991). Geometrizing rates of convergence, III, Ann. Statist., 19, 668–701.

  • Donoho, D. L., Johnstone, I. M., Kerkyacharian, G. and Picard, D. (1995). Wavelet shrinkage: asymptopia?, J. Roy. Statist. Soc. Ser. B, 57, 301–369.

  • Epanechnikov, V. A. (1969). Nonparametric estimation of a multidimensional probability density, Theory Probab. Appl., 14, 153–158.

  • Eubank, R. L. (1988). Spline Smoothing and Nonparametric Regression, Marcel Dekker, New York.

  • Fan, J. (1992). Design-adaptive nonparametric regression, J. Amer. Statist. Assoc., 87, 998–1004.

  • Fan, J. (1993). Local linear regression smoothers and their minimax efficiency, Ann. Statist., 21, 196–216.

  • Fan, J. and Gijbels, I. (1992). Variable bandwidth and local linear regression smoothers, Ann. Statist., 20, 2008–2036.

  • Fan, J. and Gijbels, I. (1995a). Data-driven bandwidth selection in local polynomial fitting: variable bandwidth and spatial adaptation, J. Roy. Statist. Soc. Ser. B, 57, 371–394.

  • Fan, J. and Gijbels, I. (1995b). Adaptive order polynomial fitting: bandwidth robustification and bias reduction, Journal of Computational and Graphical Statistics, 4, 213–227.

  • Fan, J. and Marron, J. S. (1994). Fast implementations of nonparametric curve estimators, Journal of Computational and Graphical Statistics, 3, 35–56.

  • Gasser, T. and Engel, J. (1990). The choice of weights in kernel regression estimation, Biometrika, 77, 377–381.

  • Gasser, T. and Müller, H.-G. (1984). Estimating regression functions and their derivatives by the kernel method, Scand. J. Statist., 11, 171–185.

  • Gasser, T., Müller, H.-G. and Mammitzsch, V. (1985). Kernels for nonparametric curve estimation, J. Roy. Statist. Soc. Ser. B, 47, 238–252.

  • Granovsky, B. L. and Müller, H.-G. (1991). Optimizing kernel methods: a unifying variational principle, Internat. Statist. Rev., 59, 373–388.

  • Green, P. J. and Silverman, B. W. (1994). Nonparametric Regression and Generalized Linear Models: a Roughness Penalty Approach, Chapman and Hall, London.

  • Härdle, W. (1990). Applied Nonparametric Regression, Cambridge University Press, Boston.

  • Hastie, T. J. and Loader, C. (1993). Local regression: automatic kernel carpentry (with discussion), Statist. Sci., 8, 120–143.

  • Jennen-Steinmetz, C. and Gasser, T. (1988). A unifying approach to nonparametric regression estimation, J. Amer. Statist. Assoc., 83, 1084–1089.

  • Lejeune, M. (1985). Estimation non-paramétrique par noyaux: régression polynomiale mobile, Rev. Statist. Appl., 33, 43–68.

  • Müller, H.-G. (1987). Weighted local regression and kernel methods for nonparametric curve fitting, J. Amer. Statist. Assoc., 82, 231–238.

  • Müller, H.-G. (1988). Nonparametric Analysis of Longitudinal Data, Springer, Berlin.

  • Müller, H.-G. (1991). Smooth optimum kernel estimators near endpoints, Biometrika, 78, 521–530.

  • Nadaraya, E. A. (1964). On estimating regression, Theory Probab. Appl., 9, 141–142.

  • Rao, C. R. (1973). Linear Statistical Inference and Its Applications, Wiley, New York.

  • Ruppert, D. and Wand, M. P. (1994). Multivariate locally weighted least squares regression, Ann. Statist., 22, 1346–1370.

  • Ruppert, D., Sheather, S. J. and Wand, M. P. (1995). An effective bandwidth selector for local least squares regression, J. Amer. Statist. Assoc., 90, 1257–1270.

  • Sacks, J. and Ylvisaker, D. (1981). Asymptotically optimum kernels for density estimation at a point, Ann. Statist., 9, 334–346.

  • Seifert, B., Brockmann, M., Engel, J. and Gasser, T. (1994). Fast algorithms for nonparametric curve estimation, Journal of Computational and Graphical Statistics, 3, 192–213.

  • Silverman, B. W. (1984). Spline smoothing: the equivalent variable kernel method, Ann. Statist., 12, 898–916.

  • Wahba, G. (1990). Spline Models for Observational Data, SIAM, Philadelphia.

  • Wand, M. P. and Jones, M. C. (1995). Kernel Smoothing, Chapman and Hall, London.

  • Watson, G. S. (1964). Smooth regression analysis, Sankhyā Ser. A, 26, 359–372.

Cite this article

Fan, J., Gasser, T., Gijbels, I. et al. Local Polynomial Regression: Optimal Kernels and Asymptotic Minimax Efficiency. Annals of the Institute of Statistical Mathematics 49, 79–99 (1997). https://doi.org/10.1023/A:1003162622169
