
On spline approximation of sliced inverse regression

Published in Science in China Series A: Mathematics

Abstract

Dimension reduction is helpful, and often necessary, in exploring nonparametric regression structure. In this area, sliced inverse regression (SIR) is a promising tool for estimating the central dimension reduction (CDR) space. To estimate the kernel matrix of SIR, we suggest a spline approximation based on least squares regression. Heteroscedasticity can be accommodated by introducing an appropriate weight function. Root-n asymptotic normality is achieved for a wide range of knot choices, a result essentially analogous to that for kernel estimation. Moreover, we propose a modified Bayes information criterion (BIC) based on the eigenvalues of the SIR matrix; this modified BIC can be applied to any form of SIR and other related methods. The methodology and some practical issues are illustrated through the horse mussel data. Empirical studies demonstrate the performance of the proposed spline approximation in comparison with existing estimators.
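For readers unfamiliar with SIR, the kernel matrix referred to above is, in the basic slicing estimator of Li (1991), the weighted covariance matrix of the slice means of the standardized predictors; its leading eigenvectors span an estimate of the CDR space. The sketch below implements that basic slicing version only, not this paper's spline approximation or its kernel-smoothing relatives; the function name `sir_directions` and all parameter choices are illustrative.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, d=1):
    """Basic sliced inverse regression (Li, 1991): estimate d directions
    of the central dimension reduction space from the eigenvectors of
    the covariance of slice means of the standardized predictors."""
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt
    # Slice the observations according to the order of the response y
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Kernel matrix of SIR: weighted covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)           # eigenvalues in ascending order
    B = Sigma_inv_sqrt @ v[:, ::-1][:, :d]
    return B / np.linalg.norm(B, axis=0)
```

In a toy model y = f(b'x) + error with a single direction b, the first column of the returned matrix should align with b up to sign; the paper's spline approach replaces the piecewise-constant slice means with a least squares spline fit of the inverse regression curve.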



Author information

Corresponding author

Correspondence to Li-ping Zhu.

Additional information

This work was supported by the special fund (2006) for selecting and training young teachers of universities in Shanghai (Grant No. 79001320), an FRG grant (FRG/06-07/I-06) from Hong Kong Baptist University, China, and a grant (HKU 7058/05P) from the Research Grants Council of Hong Kong, China.


About this article

Cite this article

Zhu, L.-p., Yu, Z. On spline approximation of sliced inverse regression. Sci China Ser A 50, 1289–1302 (2007). https://doi.org/10.1007/s11425-007-0085-5

