Nonparametric Regression Combining Linear Structure

  • Mathematics
  • Wuhan University Journal of Natural Sciences

Abstract

Nonparametric models are popular because of their flexibility in model building and their optimality in estimation. However, they suffer from the curse of dimensionality and make no use of prior information, so how to fully exploit the structural information hidden in the data remains a challenging issue in model building. In this paper, we propose a parametric family of estimators that penalizes deviation from linear structure. The new estimator automatically captures the linear information underlying the regression function, thereby avoiding the curse of dimensionality, and offers a smooth choice between fully nonparametric and parametric models: it reduces to the linear estimator when the model has linear structure and to the local linear estimator when it has none. Compared with a fully nonparametric fit, our estimator has smaller bias because it uses the linear structure information in the data, and it remains useful in higher dimensions, where the usual nonparametric methods suffer from the curse of dimensionality. Based on the projection framework, the theoretical results characterize the structure of the new estimator, and simulation studies demonstrate the advantages of the new approach.
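The estimator described above sits between a global linear fit and a fully local (local linear) fit, with a penalty on deviation from linearity controlling where on that continuum the fit lands. The sketch below is not the authors' projection-based construction; it is a minimal one-dimensional illustration of the idea, assuming a Gaussian kernel, a hypothetical mixing weight lam/(1 + lam) on the linear component, and illustrative function names (linear_fit, local_linear_fit, combined_estimator) that do not come from the paper.

```python
import numpy as np

def linear_fit(X, y):
    """Global linear (ordinary least squares) fit; returns a predictor function."""
    Xd = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return lambda x0: np.column_stack([np.ones(len(x0)), x0]) @ beta

def local_linear_fit(X, y, h):
    """Local linear (kernel-weighted least squares) fit with bandwidth h."""
    def predict(x0):
        preds = np.empty(len(x0))
        for i, x in enumerate(x0):
            w = np.exp(-0.5 * ((X - x) / h) ** 2)           # Gaussian kernel weights
            Xd = np.column_stack([np.ones(len(X)), X - x])  # locally centred design
            A = Xd.T @ (w[:, None] * Xd)
            b = Xd.T @ (w * y)
            preds[i] = np.linalg.solve(A, b)[0]             # intercept = fit at x
        return preds
    return predict

def combined_estimator(X, y, h, lam):
    """Blend the local linear fit with the global linear fit.
    lam = 0 gives the local linear smoother; large lam approaches the OLS line."""
    f_lin, f_loc = linear_fit(X, y), local_linear_fit(X, y, h)
    w = lam / (1.0 + lam)                                   # weight on the linear part
    return lambda x0: w * f_lin(x0) + (1.0 - w) * f_loc(x0)

# Usage: data with a mild nonlinear deviation from a linear trend.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 1.0, 200))
y = 1.0 + 2.0 * X + 0.3 * np.sin(4.0 * np.pi * X) + rng.normal(0.0, 0.2, 200)
x_grid = np.linspace(0.0, 1.0, 5)
print(combined_estimator(X, y, h=0.08, lam=1.0)(x_grid))
```

With lam = 0 the sketch reduces to the usual local linear smoother, and as lam grows it collapses onto the ordinary least squares line; in a penalized formulation the corresponding tuning parameter would be chosen from the data rather than fixed by hand as it is here.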

Author information

Corresponding author

Correspondence to Yanli Zhang.

Additional information

Foundation item: Supported by the Project of Humanities and Social Science of the Ministry of Education of China (16YJA910003), the "Financial Statistics and Risk Management" Fostering Team of Shandong University of Finance and Economics, and the Project of Shandong Province Higher Educational Science and Technology (J16LI56 and J17KA163)

About this article

Cite this article

Zhang, Y., Song, Y., Lin, L. et al. Nonparametric Regression Combining Linear Structure. Wuhan Univ. J. Nat. Sci. 24, 277–282 (2019). https://doi.org/10.1007/s11859-019-1397-3

