Sliced inverse regression (SIR) and related methods were introduced to reduce the dimensionality of regression problems. In a general semiparametric regression framework, these methods determine linear combinations of a set of explanatory variables X that are related to the response variable Y, without losing information on the conditional distribution of Y given X. They rely on a “slicing step” in both their population and sample versions, and they are sensitive to the choice of the number H of slices; this is particularly true for the SIR-II and SAVE methods. At present there are neither theoretical results nor practical techniques that allow the user to choose an appropriate number of slices. In this paper, we propose an approach based on the quality of the estimation of the effective dimension reduction (EDR) space: the square trace correlation between the true EDR space and its estimate can serve as a measure of goodness of estimation. We introduce a naïve bootstrap estimate of the square trace correlation criterion, which allows the selection of an “optimal” number of slices. Moreover, this criterion can simultaneously select the corresponding suitable dimension K (the number of linear combinations of X). From a practical point of view, the choice of the two parameters H and K is essential. We propose a 3D graphical tool, implemented in R, for selecting a suitable pair (H, K); the corresponding R package, named “edrGraphicalTools”, has been developed. In this article, we focus on the SIR-I, SIR-II and SAVE methods. The proposed criterion can also be used to determine which method best recovers the EDR space, that is, the structure between Y and X. We indicate how the criterion can be used in practice. A simulation study illustrates the behavior of this approach and the need to select the number H of slices and the dimension K properly. A short real-data example is also provided.
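To make the ideas concrete, the sketch below implements SIR-I with H slices, the square trace correlation between two subspaces, and a naïve bootstrap estimate of that criterion in Python/NumPy. This is an illustrative reconstruction under standard assumptions, not the authors' edrGraphicalTools implementation; all function names here are hypothetical.

```python
import numpy as np

def sir_directions(X, Y, H=10, K=1):
    """Estimate K EDR directions by SIR-I with H slices (illustrative sketch)."""
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    # Standardize X via the inverse square root of its covariance matrix.
    w, V = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Slice on Y: H slices with (roughly) equal counts.
    order = np.argsort(Y)
    slices = np.array_split(order, H)
    # Weighted covariance matrix of the within-slice means of Z.
    M = np.zeros((p, p))
    for idx in slices:
        zh = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(zh, zh)
    # Top K eigenvectors, mapped back to the original X scale.
    _, vecs = np.linalg.eigh(M)
    return Sigma_inv_sqrt @ vecs[:, ::-1][:, :K]

def square_trace_correlation(B_true, B_hat):
    """R(K) = trace(P_true P_hat) / K, with P the orthogonal projector onto
    the column space of B; equals 1 exactly when the two spaces coincide."""
    def proj(B):
        Q, _ = np.linalg.qr(B)
        return Q @ Q.T
    K = B_true.shape[1]
    return float(np.trace(proj(B_true) @ proj(B_hat)) / K)

def bootstrap_criterion(X, Y, H, K, n_boot=50, seed=None):
    """Naïve bootstrap estimate of the criterion for a candidate pair (H, K):
    average square trace correlation between the full-sample estimate and
    bootstrap replicates (hedged sketch of the selection idea)."""
    rng = np.random.default_rng(seed)
    n = len(Y)
    B_full = sir_directions(X, Y, H, K)
    reps = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        reps.append(square_trace_correlation(
            B_full, sir_directions(X[idx], Y[idx], H, K)))
    return float(np.mean(reps))
```

In the spirit of the paper, one would evaluate `bootstrap_criterion` over a grid of candidate (H, K) pairs and retain the pair with the highest value, which is what the proposed 3D graphical tool displays.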