A great deal of effort has been devoted to the inference of additive models in the last decade. Among existing procedures, kernel-type estimators are too costly to implement for high dimensions or large sample sizes, while spline-type estimators lack asymptotic distribution or uniform convergence results. We propose a one-step backfitting estimator of the component functions in an additive regression model, using spline estimators in the first stage followed by kernel/local linear estimators in the second. Under weak conditions, the proposed estimator's pointwise distribution is asymptotically equivalent to that of a univariate kernel/local linear estimator; hence the dimension is effectively reduced to one at any point. This dimension reduction holds uniformly over an interval under the assumption of normal errors. Monte Carlo evidence supports the asymptotic results for dimensions ranging from low to very high and sample sizes ranging from moderate to large. The proposed confidence band is applied to the Boston housing data for linearity diagnosis.
Keywords: Bandwidths · B-spline · Knots · Local linear estimator · Nadaraya–Watson estimator · Nonparametric regression
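The two-stage procedure described in the abstract can be sketched as follows: an undersmoothed additive spline fit produces pilot estimates of all components, these pilots are subtracted from the responses to form univariate pseudo-responses, and a local linear smoother is then applied to the component of interest. This is a minimal illustrative sketch, not the paper's implementation; the function name `sbll_component`, the truncated-power linear spline basis, the quantile knot placement, and the Epanechnikov kernel are all choices made here for concreteness.

```python
import numpy as np

def _linear_spline_basis(x, knots):
    # Truncated-power basis for a piecewise-linear spline: x and (x - k)_+.
    return np.column_stack([x] + [np.maximum(x - k, 0.0) for k in knots])

def sbll_component(X, y, d0, h, grid, n_knots=4):
    """Spline-backfitted local linear sketch for additive component d0.

    Stage 1: pilot additive fit by least squares on linear spline bases.
    Stage 2: univariate local linear smoothing of the pseudo-responses
             y - (pilot fits of all other components), bandwidth h.
    """
    n, d = X.shape
    blocks, sizes = [], []
    for a in range(d):
        # Interior knots at equally spaced sample quantiles of X[:, a].
        knots = np.quantile(X[:, a], np.linspace(0, 1, n_knots + 2)[1:-1])
        B = _linear_spline_basis(X[:, a], knots)
        blocks.append(B)
        sizes.append(B.shape[1])
    design = np.column_stack([np.ones(n)] + blocks)
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    # Pilot component fits, centered for identifiability.
    comp, start = [], 1
    for a in range(d):
        fit = blocks[a] @ coef[start:start + sizes[a]]
        comp.append(fit - fit.mean())
        start += sizes[a]
    # Pseudo-responses: strip the pilot fits of every other component.
    y_tilde = y - y.mean() - sum(comp[a] for a in range(d) if a != d0)
    # Stage 2: univariate local linear estimator at each grid point.
    est = np.empty(len(grid))
    for j, x0 in enumerate(grid):
        u = (X[:, d0] - x0) / h
        w = 0.75 * np.maximum(1.0 - u**2, 0.0)          # Epanechnikov kernel
        Z = np.column_stack([np.ones(n), X[:, d0] - x0])
        WZ = Z * w[:, None]
        beta = np.linalg.lstsq(WZ.T @ Z, WZ.T @ y_tilde, rcond=None)[0]
        est[j] = beta[0]                                 # local intercept
    return est
```

Because the second stage is a one-dimensional smooth, its cost and its asymptotic behavior at each grid point match those of a univariate local linear estimator, which is the dimension-reduction property the abstract refers to.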