Wuhan University Journal of Natural Sciences, Volume 24, Issue 4, pp 277–282

Nonparametric Regression Combining Linear Structure

  • Yanli Zhang
  • Yunquan Song
  • Lu Lin
  • Xiuli Wang


Abstract

Nonparametric models are popular owing to their flexibility in model building and optimality in estimation. However, nonparametric models suffer from the curse of dimensionality and do not exploit any prior information. How to sufficiently mine the structural information hidden in the data remains a challenging issue in model building. In this paper, we propose a parametric family of estimators that penalizes deviation from linear structure. The new estimator automatically captures the linear information underlying the regression function, thereby avoiding the curse of dimensionality, and offers a smooth choice between fully nonparametric models and parametric models. Moreover, the new estimator reduces to the linear estimator when the model has linear structure, and to the local linear estimator when it has none. Compared with fully nonparametric models, our estimator has smaller bias because it uses the linear structure information in the data. The new estimator is especially useful in higher dimensions, where the usual nonparametric methods suffer most from the curse of dimensionality. Based on the projection framework, the theoretical results give the structure of the new estimator, and simulation studies demonstrate the advantages of the new approach.
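The idea of an estimator that interpolates smoothly between a local linear fit and a global linear fit can be sketched as follows. This is a simplified illustration, not the paper's exact construction: it assumes a one-dimensional covariate, a Gaussian kernel, and a fixed penalty weight `lam` (in the paper the degree of shrinkage toward linear structure would be chosen from the data). The intercept of the local linear fit at each point is shrunk toward a global least-squares line, so `lam = 0` recovers the local linear estimator and `lam -> infinity` recovers the linear estimator.

```python
import numpy as np

def penalized_local_linear(x0, X, Y, h, lam):
    """Estimate the regression function at x0 by a local linear fit whose
    intercept is penalized for deviating from a global OLS line.

    Hypothetical one-dimensional sketch of a 'penalize deviation from
    linear structure' estimator: lam = 0 gives the local linear
    estimator; lam -> infinity gives the global linear estimator.
    """
    # Global linear fit: m_lin(x) = beta0 + beta1 * x
    A = np.column_stack([np.ones_like(X), X])
    beta, *_ = np.linalg.lstsq(A, Y, rcond=None)
    m_lin = beta[0] + beta[1] * x0

    # Gaussian kernel weights centered at x0 with bandwidth h
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)

    # Local linear design centered at x0: columns (1, X - x0)
    Z = np.column_stack([np.ones_like(X), X - x0])

    # Penalized weighted normal equations: the penalty
    # lam * (a - m_lin(x0))^2 only touches the intercept term.
    G = Z.T @ (w[:, None] * Z)
    G[0, 0] += lam
    rhs = Z.T @ (w * Y)
    rhs[0] += lam * m_lin
    theta = np.linalg.solve(G, rhs)
    return theta[0]  # fitted intercept = estimate of m(x0)
```

When the data are truly linear, both extremes of `lam` reproduce the true regression line, which mirrors the abstract's claim that the estimator loses nothing relative to the linear model when linear structure is present; when the data are nonlinear, small `lam` lets the local linear component dominate.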

Key words

nonparametric, full model, nonlinear

CLC number

O 212.4 





Copyright information

© Wuhan University and Springer-Verlag GmbH Germany 2019

Authors and Affiliations

  1. School of Statistics, Shandong University of Finance and Economics, Jinan, Shandong, China
  2. College of Science, China University of Petroleum, Qingdao, Shandong, China
  3. Zhongtai Securities Institute for Financial Studies, Shandong University, Jinan, Shandong, China
  4. School of Mathematics and Statistics, Shandong Normal University, Jinan, Shandong, China
