On principal Hessian directions for multivariate response regressions

Abstract

We consider multivariate response regression with a vector of predictors. In this article, we develop a modification of principal Hessian directions based on principal components for estimating the central mean subspace without requiring a prespecified parametric model. We use the permutation test suggested by Cook and Yin (Aust New Z J Stat 43:147–199, 2001) for inference about the dimension. Simulation results and a real-data example are reported, and comparisons are made with four methods: most predictable variates, k-means inverse regression, the optimal method of Yoo and Cook (Biometrika 94:231–242, 2007), and the canonical correlation approach.
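To make the building block concrete, the sketch below illustrates the classical response-based principal Hessian directions estimator of Li (1992) for a single response. It is an illustration only, not the paper's principal-component-based modification for multivariate responses; the function name and interface are our own assumptions.

```python
import numpy as np

def phd_directions(X, y):
    """Response-based principal Hessian directions (Li, 1992) for a
    univariate response. X is an (n, p) predictor matrix, y an (n,) response.
    Returns eigenvalues and candidate directions for the central mean
    subspace, ordered by absolute eigenvalue."""
    n, p = X.shape
    # Center and standardize the predictors: z = Sigma^{-1/2} (x - mu).
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    w, V = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
    Z = (X - mu) @ Sigma_inv_sqrt

    # Moment estimate of the average Hessian: mean of (y_i - ybar) * z_i z_i'.
    r = y - y.mean()
    H = (Z * r[:, None]).T @ Z / n

    # Eigenvectors with the largest |eigenvalue| span the estimated subspace.
    lam, U = np.linalg.eigh(H)
    order = np.argsort(-np.abs(lam))
    directions = Sigma_inv_sqrt @ U[:, order]  # back to the original x-scale
    return lam[order], directions
```

In the multivariate-response setting, the abstract indicates that this idea is combined with principal components of the response; the details of that modification and of the permutation test for the dimension are given in the full article.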

References

  • Aragon Y (1997) A Gauss implementation of multivariate sliced inverse regression. Comp Stat 12: 355–372
  • Bura E, Cook RD (2001) Estimating the structural dimension of regressions via parametric inverse regression. J Roy Stat Soc Ser B 63: 393–410
  • Cheng CS, Li KC (1995) A study of the method of principal Hessian directions for analysis of data from designed experiments. Stat Sin 5: 617–639
  • Cook RD (1994) On the interpretation of regression plots. J Am Stat Assoc 89: 177–189
  • Cook RD (1998) Principal Hessian directions revisited (with discussion). J Am Stat Assoc 93: 84–100
  • Cook RD (2007) Fisher lecture: dimension reduction in regression. Stat Sci 22: 1–26
  • Cook RD, Li B (2002) Dimension reduction for conditional mean in regression. Ann Stat 30: 455–474
  • Cook RD, Li B, Chiaromonte F (2009) Envelope models for parsimonious and efficient multivariate linear regression (with discussion). Stat Sin (to appear)
  • Cook RD, Nachtsheim JC (1994) Reweighting to achieve elliptically contoured covariates in regression. J Am Stat Assoc 89: 592–599
  • Cook RD, Ni L (2005) Sufficient dimension reduction via inverse regression: a minimum discrepancy approach. J Am Stat Assoc 100: 410–428
  • Cook RD, Setodji CM (2003) A model-free test for reduced rank in multivariate regression. J Am Stat Assoc 98: 340–351
  • Cook RD, Weisberg S (1991) Comment on “Sliced inverse regression for dimension reduction” by KC Li. J Am Stat Assoc 86: 328–332
  • Cook RD, Yin X (2001) Dimension reduction and visualization in discriminant analysis (with discussion). Aust New Z J Stat 43: 147–199
  • Diaconis P, Freedman D (1984) Asymptotics of graphical projection pursuit. Ann Stat 12: 793–815
  • Eaton ML (1986) A characterization of spherical distributions. J Multivar Anal 20: 272–276
  • Hall P, Li KC (1993) On almost linearity of low dimensional projections from high dimensional data. Ann Stat 21: 867–889
  • Hotelling H (1935) The most predictable criterion. J Educ Psychol 26: 139–142
  • Hsing T (1999) Nearest-neighbor inverse regression. Ann Stat 27: 697–731
  • Li B, Wen S, Zhu L (2008) On a projective resampling method for dimension reduction with multivariate responses. J Am Stat Assoc 103: 1177–1186
  • Li B, Zha H, Chiaromonte F (2005) Contour regression: a general approach to dimension reduction. Ann Stat 33: 1580–1616
  • Li KC (1989) Data visualization with SIR: a transformation-based projection pursuit method. UCLA statistical series 24. University of California, Los Angeles
  • Li KC (1991) Sliced inverse regression for dimension reduction (with discussion). J Am Stat Assoc 86: 316–342
  • Li KC (1992) On principal Hessian directions for data visualization and dimension reduction: another application of Stein’s lemma. J Am Stat Assoc 87: 1025–1039
  • Li KC, Aragon Y, Shedden K, Agnan CT (2003) Dimension reduction for multivariate response data. J Am Stat Assoc 98: 99–109
  • Lue HH (2004) Principal Hessian directions for regression with measurement error. Biometrika 91: 409–423
  • Lue HH (2009) Sliced inverse regression for multivariate response regression. J Stat Plan Inference 139: 2656–2664
  • Pan WH, Bai CH, Chen JR, Chiu HC (1997) Associations between carotid atherosclerosis and high factor VIII activity, dyslipidemia, and hypertension. Stroke 28: 88–94
  • Saracco J (2005) Asymptotics for pooled marginal slicing estimator based on SIRα approach. J Multivar Anal 96: 117–135
  • Setodji CM, Cook RD (2004) K-means inverse regression. Technometrics 46: 421–429
  • Stein C (1981) Estimation of the mean of a multivariate normal distribution. Ann Stat 9: 1135–1151
  • Xia Y, Tong H, Li WK, Zhu LX (2002) An adaptive estimation of dimension reduction space. J Roy Stat Soc Ser B 64: 363–410
  • Yin X, Bura E (2006) Moment-based dimension reduction for multivariate response regression. J Stat Plan Inference 136: 3675–3688
  • Yoo JK, Cook RD (2007) Optimal sufficient dimension reduction for the conditional mean in multivariate regression. Biometrika 94: 231–242

Author information

Correspondence to Heng-Hui Lue.

About this article

Cite this article

Lue, HH. On principal Hessian directions for multivariate response regressions. Comput Stat 25, 619–632 (2010). https://doi.org/10.1007/s00180-010-0192-6
