
Data-driven slicing for dimension reduction in regressions: A likelihood-ratio approach

Published in Science China Mathematics.

Abstract

To efficiently estimate the central subspace in sufficient dimension reduction, discretizing the response by slicing its range is one of the most widely used devices when inverse regression-based methods are applied. However, existing slicing schemes are almost all ad hoc and none is universally accepted, so how to define a data-driven scheme with certain optimality properties has been a longstanding problem in this field. The research described herein is two-fold. First, we introduce a likelihood-ratio-based framework for dimension reduction that subsumes popular methods including sliced inverse regression, sliced average variance estimation and likelihood acquired directions. Second, we propose a regularized log-likelihood-ratio criterion that yields a data-driven slicing scheme, and we derive the asymptotic properties of the resulting estimators. A simulation study examines the performance of the proposed method against existing methods, and a data set on concrete compressive strength is analyzed for illustration and comparison.
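To make the slicing idea concrete, the following is a minimal sketch of classical sliced inverse regression with the common equal-count (quantile) slicing scheme that the paper's data-driven criterion is designed to improve upon. The function name `sir_directions` and its defaults are illustrative, not from the paper; this is the textbook estimator, not the authors' likelihood-ratio method.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Estimate central-subspace directions by sliced inverse regression (SIR)."""
    n, p = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    # Whiten the predictors: Z = (X - mu) @ Sigma^{-1/2}
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ inv_sqrt
    # Equal-count slicing: partition the ordered responses into n_slices groups
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the within-slice means of Z (the SIR kernel matrix)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M span the estimated subspace on the Z scale;
    # map back to the X scale and normalize the columns
    _, v = np.linalg.eigh(M)
    B = inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)
```

The choice of `n_slices` here is exactly the ad hoc tuning decision the abstract refers to: different slice counts can give noticeably different estimates, which motivates selecting the scheme by a data-driven criterion instead of fixing it in advance.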



Acknowledgements

This work was supported by National Natural Science Foundation of China (Grant Nos. 11971017 and 11971018), Shanghai Rising-Star Program (Grant No. 20QA1407500), Multidisciplinary Cross Research Foundation of Shanghai Jiao Tong University (Grant Nos. YG2019QNA26, YG2019QNA37 and YG2021QN06) and Neil Shen’s SJTU Medical Research Fund of Shanghai Jiao Tong University. The authors are grateful to the referees for their helpful comments.

Author information

Corresponding author: Tao Wang.


Cite this article

Xu, P., Wang, T. & Zhu, L. Data-driven slicing for dimension reduction in regressions: A likelihood-ratio approach. Sci. China Math. 67, 647–664 (2024). https://doi.org/10.1007/s11425-022-2088-x

