Abstract
Dimension reduction is helpful, and often necessary, in exploring nonparametric regression structure. In this area, sliced inverse regression (SIR) is a promising tool for estimating the central dimension reduction (CDR) space. To estimate the kernel matrix of SIR, we suggest a spline approximation based on least squares regression. Heteroscedasticity can be incorporated by introducing an appropriate weight function. Root-n asymptotic normality can be achieved for a wide range of knot choices, which is essentially analogous to kernel estimation. Moreover, we propose a modified Bayes information criterion (BIC) based on the eigenvalues of the SIR matrix; this modified BIC can be applied to any form of SIR and other related methods. The methodology and some practical issues are illustrated through the horse mussel data. Empirical studies demonstrate the performance of the proposed spline approximation in comparison with existing estimators.
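To fix ideas, the following is a minimal sketch of the two ingredients the abstract mentions: an estimate of the SIR kernel matrix Cov(E[X | y]) and an eigenvalue-based BIC-type choice of the structural dimension. The sketch uses the classical slicing estimate of Li (1991) for the kernel matrix; the paper's spline approach instead fits E[X | y] by least squares splines. The penalty constant in `estimate_dimension` is purely illustrative, not the paper's exact criterion.

```python
import numpy as np

def sir_kernel_matrix(X, y, n_slices=10):
    """Slicing estimate of the SIR kernel matrix Cov(E[X | y]).

    Classical slicing version of SIR; the paper's spline-based estimate
    replaces slice means with a least squares spline fit of X on y.
    """
    n, p = X.shape
    # Standardize X: work with Z = Sigma^{-1/2} (X - mu).
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    w, V = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
    Z = (X - mu) @ Sigma_inv_sqrt

    # Slice on the order statistics of y; average Z within each slice.
    order = np.argsort(y)
    M = np.zeros((p, p))
    for s in np.array_split(order, n_slices):
        m = Z[s].mean(axis=0)
        M += (len(s) / n) * np.outer(m, m)
    return M

def estimate_dimension(M, n, c=None):
    """BIC-type choice of the structural dimension K: maximize a fit term
    built from the leading eigenvalues of M minus a penalty growing in k.
    The penalty c = sqrt(n) is an illustrative placeholder."""
    p = M.shape[0]
    lam = np.sort(np.linalg.eigvalsh(M))[::-1]
    if c is None:
        c = np.sqrt(n)
    crit = [n * np.sum(np.log1p(lam[:k])) - c * k for k in range(p + 1)]
    return int(np.argmax(crit))
```

For a single-index model y = x'beta + error, the criterion should select K = 1 and the leading eigenvector of M (mapped back through Sigma^{-1/2}) estimates the CDR direction.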
Additional information
This work was supported by the special fund (2006) for selecting and training young teachers of universities in Shanghai (Grant No. 79001320), by an FRG grant (FRG/06-07/I-06) from Hong Kong Baptist University, China, and by a grant (HKU 7058/05P) from the Research Grants Council of Hong Kong, China.
Cite this article
Zhu, L.-P., Yu, Z. On spline approximation of sliced inverse regression. Sci China Ser A 50, 1289–1302 (2007). https://doi.org/10.1007/s11425-007-0085-5
Keywords
- asymptotic normality
- spline
- Bayes information criterion
- dimension reduction
- sliced inverse regression
- structural dimensionality