
Cook’s Fisher Lectureship Revisited for Semi-supervised Data Reduction

  • Chapter in: Festschrift in Honor of R. Dennis Cook

Abstract

R. D. Cook’s Fisher lectureship (Cook, Stat Sci 22:1–26, 2007) opened a new and seminal paradigm in the sufficient dimension reduction literature. It advocates a model-based approach, which makes it possible to reduce the dimension of categorical and continuous predictors simultaneously. Here, the lectureship is extended to reducing the predictors in semi-supervised data under an isotonic error model; such data are common in applied fields such as speech recognition, spam email filtering, artificial intelligence, and video surveillance. Under the isotonic error model, a combined dimension reduction model is proposed for semi-supervised data, and the related theory is investigated. Numerical studies and a real data example confirm its potential usefulness.
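
To make the model-based starting point concrete, the following is a minimal sketch of the principal fitted components (PFC) reduction under the isotonic error model of Cook (2007), on which the chapter builds; the chapter's combined semi-supervised reduction itself is not reproduced here. The function name, the cubic basis, and the simulated data are illustrative assumptions, not code from the chapter.

```python
import numpy as np

def isotonic_pfc(X, FY, d):
    """Principal fitted components (PFC) under the isotonic error model
    X | Y ~ N(mu + Gamma beta f(Y), sigma^2 I_p) (Cook, 2007).

    X  : (n, p) predictor matrix
    FY : (n, r) matrix of basis functions f(Y) evaluated at the responses
    d  : dimension of the reduction
    Returns a (p, d) orthonormal basis G; the sufficient reduction is G^T x.
    """
    Xc = X - X.mean(axis=0)                # center the predictors
    Fc = FY - FY.mean(axis=0)              # center the basis functions
    # Fitted values from the multivariate regression of X on f(Y)
    B, *_ = np.linalg.lstsq(Fc, Xc, rcond=None)
    Xhat = Fc @ B
    # Under isotonic (sigma^2 I) errors, the MLE of span(Gamma) is spanned
    # by the top-d right singular vectors of the fitted-value matrix
    _, _, Vt = np.linalg.svd(Xhat, full_matrices=False)
    return Vt[:d].T

# Illustrative use on simulated labeled data (hypothetical setup)
rng = np.random.default_rng(0)
n, p, d = 200, 10, 1
y = rng.normal(size=n)
gamma = np.zeros((p, d))
gamma[0, 0] = 1.0                          # true reduction: the first predictor
X = y[:, None] @ gamma.T + 0.5 * rng.normal(size=(n, p))
FY = np.column_stack([y, y**2, y**3])      # cubic polynomial basis for f(Y)
G = isotonic_pfc(X, FY, d)
print(np.round(np.abs(G[:, 0]), 2))        # loads almost entirely on X_1
```

The SVD step reflects the fact that, under isotonic errors, maximizing the likelihood reduces to an eigen-decomposition of the sample covariance of the fitted values; with a general error covariance, a weighted version is needed instead.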


References

  • K. Adragni, R.D. Cook, Sufficient dimension reduction and prediction in regression. Philos. Trans. R. Soc. A 367, 4385–4405 (2009)
  • R. Ando, T. Zhang, A framework for learning predictive structures from multiple tasks and unlabeled data. J. Mach. Learn. Res. 6, 1817–1853 (2005)
  • A. Blum, T. Mitchell, Combining labeled and unlabeled data with co-training, in Proceedings of the Eleventh Annual Conference on Computational Learning Theory (1998), pp. 92–100
  • E. Bura, L. Forzani, Sufficient reductions in regressions with elliptically contoured inverse predictors. J. Am. Stat. Assoc. 110, 420–434 (2015)
  • E. Bura, S. Duarte, L. Forzani, Sufficient reductions in regressions with exponential family inverse predictors. J. Am. Stat. Assoc. 111, 1313–1329 (2016)
  • O. Chapelle, A. Zien, Semi-supervised classification by low density separation, in Proceedings of the International Workshop on Artificial Intelligence and Statistics (2005), pp. 57–64
  • X. Chen, F. Zou, R.D. Cook, Coordinate-independent sparse sufficient dimension reduction and variable selection. Ann. Stat. 38, 3696–3723 (2010)
  • M. Collins, Y. Singer, Unsupervised models for named entity classification, in Proceedings of the Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora (1999), pp. 100–110
  • R.D. Cook, Regression Graphics (Wiley, New York, 1998)
  • R.D. Cook, Dimension reduction and graphical exploration in regression including survival analysis. Stat. Med. 22, 1399–1413 (2003)
  • R.D. Cook, Fisher lecture: dimension reduction in regression. Stat. Sci. 22, 1–26 (2007)
  • R.D. Cook, Principal components, sufficient dimension reduction and envelopes. Annu. Rev. Stat. Appl. 5, 533–559 (2018)
  • R.D. Cook, L. Li, Dimension reduction in regressions with exponential family predictors. J. Comput. Graph. Stat. 18, 774–791 (2009)
  • R.D. Cook, L. Forzani, Principal fitted components for dimension reduction in regression. Stat. Sci. 23, 485–501 (2008)
  • R.D. Cook, L. Forzani, Likelihood-based sufficient dimension reduction. J. Am. Stat. Assoc. 104, 197–208 (2009)
  • S. Ding, R.D. Cook, Dimension folding PCA and PFC for matrix-valued predictors. Stat. Sinica 24, 463–492 (2014)
  • J. Hooper, Simultaneous equations and canonical correlation theory. Econometrica 27, 245–256 (1959)
  • B. Li, Sufficient Dimension Reduction: Methods and Applications with R (Chapman and Hall/CRC, New York, 2018)
  • K. Nigam, A. McCallum, S. Thrun, T. Mitchell, Text classification from labeled and unlabeled documents using EM. Mach. Learn. 39, 103–134 (2000)
  • V. Vapnik, Statistical Learning Theory (Wiley, New York, 1998)
  • J. Wang, X. Shen, Large margin semi-supervised learning. J. Mach. Learn. Res. 8, 1867–1891 (2007)
  • J. Wang, X. Shen, W. Pan, On transductive support vector machine. Contemp. Math. 43, 7–19 (2007)
  • J.K. Yoo, Fused sliced inverse regression in survival analysis. Commun. Stat. Appl. Methods 24, 533–541 (2017)
  • J.K. Yoo, R.D. Cook, Response dimension reduction for the conditional mean in multivariate regression. Comput. Stat. Data Anal. 53, 334–343 (2008)
  • J.K. Yoo, K. Lee, Model-free predictor tests in survival regression through sufficient dimension reduction. Lifetime Data Anal. 17, 433–444 (2011)
  • X. Zhu, Z. Ghahramani, J. Lafferty, Semi-supervised learning using Gaussian fields and harmonic functions, in Proceedings of the Twentieth International Conference on Machine Learning (2003), pp. 912–919


Acknowledgements

I sincerely thank Professor R. D. Cook for raising me from an academic child and for inspiring and supporting me all along. Without you, I could not be who I am today. You make me strive to be a better person, father, and professor.

For Jae Keun Yoo, this work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Korean Ministry of Education (NRF-2019R1F1A1050715/2019R1A6A1A11051177).

Author information

Corresponding author

Correspondence to Jae Keun Yoo.


Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Yoo, J.K. (2021). Cook’s Fisher Lectureship Revisited for Semi-supervised Data Reduction. In: Bura, E., Li, B. (eds) Festschrift in Honor of R. Dennis Cook. Springer, Cham. https://doi.org/10.1007/978-3-030-69009-0_9

