Estimating Sufficient Dimension Reduction Spaces by Invariant Linear Operators

Festschrift in Honor of R. Dennis Cook

Abstract

We propose two new classes of estimators of the sufficient dimension reduction space based on invariant linear operators. Many second-order dimension reduction estimators, such as the Sliced Average Variance Estimate, Sliced Inverse Regression-II, Contour Regression, and Directional Regression, rely on the assumptions of a linear conditional mean and a constant conditional variance. In this paper we show that, under the linear conditional mean assumption alone, the candidate matrices for many second-order estimators are invariant for the dimension reduction subspace. As a result, these matrices provide useful information about the dimension reduction subspace: a subset of their eigenvectors spans it. Using this property, we develop two new methods for estimating the central subspace: the Iterative Invariant Transformation and the Nonparametrically Boosted Inverse Regression, the latter of which is guaranteed to be a \(\sqrt {n}\)-consistent and exhaustive estimator of the dimension reduction subspace. We also conduct a simulation study that provides strong evidence that the Nonparametrically Boosted Inverse Regression outperforms some of the classical second-order inverse regression methods.
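To make the abstract's setting concrete, the following is a minimal NumPy sketch of one of the classical second-order methods it mentions, the Sliced Average Variance Estimate (SAVE), whose candidate matrix has leading eigenvectors that estimate directions in the dimension reduction subspace. This is not the chapter's new estimator; the function name `save_directions` and all implementation choices (number of slices, eigendecomposition details) are illustrative assumptions.

```python
import numpy as np

def save_directions(X, y, d, n_slices=5):
    """Minimal SAVE sketch: estimate d directions of the central subspace.

    Builds the SAVE candidate matrix M = sum_s w_s (I - Var(Z | slice s))^2
    on standardized predictors Z, then maps its top eigenvectors back to
    the original predictor scale.
    """
    n, p = X.shape
    # Standardize predictors: Z = Sigma^{-1/2} (X - mean)
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt

    # Slice by the sorted response and accumulate (I - Var(Z|slice))^2
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        A = np.eye(p) - np.cov(Z[idx], rowvar=False)
        M += (len(idx) / n) * (A @ A)

    # Leading eigenvectors of M span the estimated subspace (in Z scale)
    w, V = np.linalg.eigh(M)
    B = Sigma_inv_sqrt @ V[:, np.argsort(w)[::-1][:d]]
    return B / np.linalg.norm(B, axis=0)
```

For a symmetric model such as \(y = (\beta^{\top} X)^2 + \varepsilon\), where first-order methods like SIR fail, the leading SAVE direction should align closely with \(\beta\).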


References

  • E. Bura, R.D. Cook, Estimating the structural dimension of regressions via parametric inverse regression. J. Roy. Stat. Soc. B 63, 393–410 (2001)

  • J.B. Conway, A Course in Functional Analysis, 2nd edn. (Springer, New York, 1990)

  • R.D. Cook, Using dimension-reduction subspaces to identify important inputs in models of physical systems, in 1994 Proceedings of the Section on Physical and Engineering Sciences (American Statistical Association, Alexandria, VA, 1994), pp. 18–25

  • R.D. Cook, Principal Hessian directions revisited. J. Am. Stat. Assoc. 93, 84–94 (1998a)

  • R.D. Cook, Regression Graphics: Ideas for Studying Regressions Through Graphics (Wiley, New York, 1998b)

  • R.D. Cook, B. Li, Dimension reduction for conditional mean in regression. Ann. Stat. 30, 455–474 (2002)

  • R.D. Cook, B. Li, Determining the dimension of iterative Hessian transformation. Ann. Stat. 32, 2501–2531 (2004)

  • R.D. Cook, L. Ni, Sufficient dimension reduction via inverse regression: a minimum discrepancy approach. J. Am. Stat. Assoc. 100, 410–428 (2005)

  • R.D. Cook, S. Weisberg, Sliced inverse regression for dimension reduction: comment. J. Am. Stat. Assoc. 86, 328–332 (1991)

  • A.P. Dawid, Conditional independence in statistical theory. J. Roy. Stat. Soc. B 41, 1–31 (1979)

  • M.L. Eaton, Multivariate Statistics: A Vector Space Approach (Institute of Mathematical Statistics, 2007)

  • L. Ferré, A.F. Yao, Functional sliced inverse regression analysis. Statistics 37, 475–488 (2003)

  • L. Ferré, A.F. Yao, Smoothed functional inverse regression. Statistica Sinica 15, 665–683 (2005)

  • T. Kato, Perturbation Theory for Linear Operators (Springer, Berlin, 1980)

  • B. Li, Sufficient Dimension Reduction: Methods and Applications with R (CRC Press/Chapman & Hall, 2018)

  • B. Li, J. Song, Nonlinear sufficient dimension reduction for functional data. Ann. Stat. 45, 1059–1095 (2017)

  • B. Li, S. Wang, On directional regression for dimension reduction. J. Am. Stat. Assoc. 102, 997–1008 (2007)

  • B. Li, H. Zha, F. Chiaromonte, Contour regression: a general approach to dimension reduction. Ann. Stat. 33, 1580–1616 (2005)

  • K.-C. Li, Sliced inverse regression for dimension reduction. J. Am. Stat. Assoc. 86, 316–327 (1991)

  • K.-C. Li, On principal Hessian directions for data visualization and dimension reduction: another application of Stein’s lemma. J. Am. Stat. Assoc. 87, 1025–1039 (1992)

  • K.-C. Li, N. Duan, Regression analysis under link violation. Ann. Stat. 17, 1009–1052 (1989)

  • H. Lian, G. Li, Series expansion for functional sufficient dimension reduction. J. Multivariate Anal. 124, 150–165 (2014)

  • W. Luo, B. Li, Combining eigenvalues and variation of eigenvectors for order determination. Biometrika 103, 875–887 (2016)

  • S. Wang, Dimension Reduction in Regression. Ph.D. thesis, Pennsylvania State University (2005)

  • Y. Xia, A constructive approach to the estimation of dimension reduction directions. Ann. Stat. 35, 2654–2690 (2007)

  • Y. Xia, H. Tong, W.K. Li, L.-X. Zhu, An adaptive estimation of dimension reduction space. J. Roy. Stat. Soc. B 64, 363–410 (2002)

  • Z. Ye, R.E. Weiss, Using the bootstrap to select one of a new class of dimension reduction methods. J. Am. Stat. Assoc. 98, 968–979 (2003)

  • X. Yin, B. Li, R.D. Cook, Successive direction extraction for estimating the central subspace in a multiple-index regression. J. Multivariate Anal. 99, 1733–1757 (2008)

Acknowledgements

The author would like to thank two referees and Professor Efstathia Bura for their thoughtful and helpful comments and suggestions. The author’s research is supported in part by the National Science Foundation grant DMS-1713078, which he gratefully acknowledges.

Author information

Correspondence to Bing Li.


Copyright information

© 2021 Springer Nature Switzerland AG


Cite this chapter

Li, B. (2021). Estimating Sufficient Dimension Reduction Spaces by Invariant Linear Operators. In: Bura, E., Li, B. (eds) Festschrift in Honor of R. Dennis Cook. Springer, Cham. https://doi.org/10.1007/978-3-030-69009-0_3
