Abstract
We propose two new classes of estimators of the sufficient dimension reduction space based on invariant linear operators. Many second-order dimension reduction estimators, such as the Sliced Average Variance Estimate, Sliced Inverse Regression-II, Contour Regression, and Directional Regression, rely on the assumptions of a linear conditional mean and a constant conditional variance. In this paper we show that, under the linear conditional mean assumption alone, the candidate matrices for many second-order estimators are invariant for the dimension reduction subspace. As a result, these matrices provide useful information about the dimension reduction subspace: a subset of their eigenvectors spans it. Using this property, we develop two new methods for estimating the central subspace: the Iterative Invariant Transformation and the Nonparametrically Boosted Inverse Regression, the second of which is guaranteed to be a \(\sqrt{n}\)-consistent and exhaustive estimator of the dimension reduction subspace. We also conduct a simulation study that provides strong evidence that the Nonparametrically Boosted Inverse Regression outperforms some of the classical second-order inverse regression methods.
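To fix ideas, the classical inverse regression estimators that the abstract refers to all work by eigen-decomposing a candidate matrix built from sliced moments of the standardized predictors. The sketch below is a minimal, illustrative implementation of the simplest such method, Sliced Inverse Regression (Li, 1991), on simulated single-index data; it is not the chapter's proposed estimator, and all variable names and the slicing scheme are choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-index model: y depends on X only through the direction b,
# so the central subspace is span(b).
n, p = 2000, 5
b = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
X = rng.standard_normal((n, p))
y = (X @ b) ** 3 + 0.5 * rng.standard_normal(n)

# Sliced Inverse Regression: standardize X, slice on y, and
# eigen-decompose the covariance of the within-slice means of Z.
Xc = X - X.mean(axis=0)
L = np.linalg.cholesky(np.cov(X, rowvar=False))   # Sigma = L L'
Z = Xc @ np.linalg.inv(L).T                       # cov(Z) ~ identity

H = 10                                            # number of slices
order = np.argsort(y)
M = np.zeros((p, p))                              # candidate matrix
for h in np.array_split(order, H):
    m = Z[h].mean(axis=0)                         # slice mean of Z
    M += (len(h) / n) * np.outer(m, m)

eta = np.linalg.eigh(M)[1][:, -1]                 # leading eigenvector
beta_hat = np.linalg.inv(L).T @ eta               # back to the X scale
beta_hat /= np.linalg.norm(beta_hat)

print(abs(beta_hat @ b))  # close to 1: the direction is recovered
```

Second-order methods such as SAVE or Directional Regression replace the slice means with slice variances (or combinations of both) in the candidate matrix, which is where the constant conditional variance assumption traditionally enters; the chapter's point is that the resulting matrices remain informative without that assumption.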
References
E. Bura, R.D. Cook, Estimating the structural dimension of regressions via parametric inverse regression. J. Roy. Stat. Soc. B 63, 393–410 (2001)
J.B. Conway, A Course in Functional Analysis, 2nd edn. (Springer, 1990)
R.D. Cook, Using dimension-reduction subspaces to identify important inputs in models of physical systems, in 1994 Proceedings of the Section on Physical and Engineering Sciences (American Statistical Association, Alexandria, VA, 1994), pp. 18–25
R.D. Cook, Principal Hessian directions revisited. J. Am. Stat. Assoc. 93, 84–94 (1998a)
R.D. Cook, Regression Graphics: Ideas for Studying Regressions Through Graphics (Wiley, New York, 1998b)
R.D. Cook, B. Li, Dimension reduction for conditional mean in regression. Ann. Stat. 30, 455–474 (2002)
R.D. Cook, B. Li, Determining the dimension of iterative Hessian transformation. Ann. Stat. 32, 2501–2531 (2004)
R.D. Cook, L. Ni, Sufficient dimension reduction via inverse regression: a minimum discrepancy approach. J. Am. Stat. Assoc. 100, 410–428 (2005)
R.D. Cook, S. Weisberg, Sliced inverse regression for dimension reduction: Comment. J. Am. Stat. Assoc. 86, 328–332 (1991)
A.P. Dawid, Conditional independence in statistical theory. J. Roy. Stat. Soc. B (Methodological) 41, 1–31 (1979)
M.L. Eaton, Multivariate Statistics: A Vector Space Approach (Institute of Mathematical Statistics, 2007)
L. Ferré, A.F. Yao, Functional sliced inverse regression analysis. Stat. J. Theor. Appl. Stat. 37, 475–488 (2003)
L. Ferré, A.F. Yao, Smoothed functional inverse regression. Statistica Sinica 15, 665–683 (2005)
T. Kato, Perturbation Theory for Linear Operators (Springer, 1980)
B. Li, Sufficient Dimension Reduction: Methods and Applications with R (CRC Press/Chapman & Hall, 2018)
B. Li, J. Song, Nonlinear sufficient dimension reduction for functional data. Ann. Stat. 45, 1059–1095 (2017)
B. Li, S. Wang, On directional regression for dimension reduction. J. Am. Stat. Assoc. 102, 997–1008 (2007)
B. Li, H. Zha, F. Chiaromonte, Contour regression: A general approach to dimension reduction. Ann. Stat. 33, 1580–1616 (2005)
K.-C. Li, Sliced inverse regression for dimension reduction. J. Am. Stat. Assoc. 86, 316–327 (1991)
K.-C. Li, On principal Hessian directions for data visualization and dimension reduction: Another application of Stein’s lemma. J. Am. Stat. Assoc. 87, 1025–1039 (1992)
K.-C. Li, N. Duan, Regression analysis under link violation. Ann. Stat. 17, 1009–1052 (1989)
H. Lian, G. Li, Series expansion for functional sufficient dimension reduction. J. Multivariate Anal. 124, 150–165 (2014)
W. Luo, B. Li, Combining eigenvalues and variation of eigenvectors for order determination. Biometrika 103, 875–887 (2016)
S. Wang, Dimension Reduction in Regression, Ph.D. Thesis, Pennsylvania State University (2005)
Y. Xia, A constructive approach to the estimation of dimension reduction directions. Ann. Stat. 35, 2654–2690 (2007)
Y. Xia, H. Tong, W.K. Li, L.-X. Zhu, An adaptive estimation of dimension reduction space. J. Roy. Stat. Soc. B 64, 363–410 (2002)
Z. Ye, R.E. Weiss, Using the bootstrap to select one of a new class of dimension reduction methods. J. Am. Stat. Assoc. 98, 968–979 (2003)
X. Yin, B. Li, R. Cook, Successive direction extraction for estimating the central subspace in a multiple-index regression. J. Multivariate Anal. 99, 1733–1757 (2008)
Acknowledgements
The author would like to thank two referees and Professor Efstathia Bura for their thoughtful and helpful comments and suggestions. The author’s research is supported in part by the National Science Foundation grant DMS-1713078, which he gratefully acknowledges.
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this chapter
Li, B. (2021). Estimating Sufficient Dimension Reduction Spaces by Invariant Linear Operators. In: Bura, E., Li, B. (eds) Festschrift in Honor of R. Dennis Cook. Springer, Cham. https://doi.org/10.1007/978-3-030-69009-0_3
Print ISBN: 978-3-030-69008-3
Online ISBN: 978-3-030-69009-0
eBook Packages: Mathematics and Statistics (R0)