Abstract
R. D. Cook’s Fisher lectureship (Cook, Stat Sci 22:1–26, 2007) opened a seminal paradigm in the sufficient dimension reduction literature. It advocates a model-based approach and makes it possible to reduce the dimension of categorical and continuous predictors simultaneously. Here, the lectureship is extended to reducing the predictors in semi-supervised data under an isotonic error model; such data are common in many popular scientific fields, including speech recognition, spam email filtering, artificial intelligence, and video surveillance. Under the isotonic error model, a combined dimension reduction model is proposed for semi-supervised data, and the related theory is investigated. Numerical studies and a real data example confirm its potential usefulness.
References
K. Adragni, R.D. Cook, Sufficient dimension reduction and prediction in regression. Philos. Trans. R. Soc. A 367, 4385–4405 (2009)
R. Ando, T. Zhang, A framework for learning predictive structures from multiple tasks and unlabeled data. J. Mach. Learn. Res. 6, 1817–1853 (2005)
A. Blum, T. Mitchell, Combining labeled and unlabeled data with co-training, in Proceedings of the Eleventh Annual Conference on Computational Learning Theory (1998), pp. 92–100
E. Bura, L. Forzani, Sufficient reductions in regressions with elliptically contoured inverse predictors. J. Am. Stat. Assoc. 110, 420–434 (2015)
E. Bura, S. Duarte, L. Forzani, Sufficient reductions in regressions with exponential family inverse predictors. J. Am. Stat. Assoc. 111, 1313–1329 (2016)
O. Chapelle, A. Zien, Semi-supervised classification by low density separation, in Proceedings of International Workshop on Artificial Intelligence and Statistics (2005), pp. 57–64
X. Chen, F. Zou, R.D. Cook, Coordinate-independent sparse sufficient dimension reduction and variable selection. Ann. Stat. 38, 3696–3723 (2010)
M. Collins, Y. Singer, Unsupervised models for named entity classification, in Proceedings of the Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora (1999), pp. 100–110
R.D. Cook, Regression Graphics (Wiley, New York, 1998)
R.D. Cook, Dimension reduction and graphical exploration in regression including survival analysis. Stat. Med. 22, 1399–1413 (2003)
R.D. Cook, Fisher lecture: dimension reduction in regression. Stat. Sci. 22, 1–26 (2007)
R.D. Cook, Principal components, sufficient dimension reduction and envelopes. Annu. Rev. Stat. Appl. 5, 533–559 (2018)
R.D. Cook, L. Li, Dimension reduction in regressions with exponential family predictors. J. Comput. Graph. Stat. 18, 774–791 (2009)
R.D. Cook, L. Forzani, Principal fitted components for dimension reduction in regression. Stat. Sci. 23, 485–501 (2008)
R.D. Cook, L. Forzani, Likelihood-based sufficient dimension reduction. J. Am. Stat. Assoc. 104, 197–208 (2009)
S. Ding, R.D. Cook, Dimension folding PCA and PFC for matrix-valued predictors. Stat. Sinica 24, 463–492 (2014)
J. Hooper, Simultaneous equations and canonical correlation theory. Econometrica 27, 245–256 (1959)
B. Li, Sufficient Dimension Reduction: Methods and Applications with R (Chapman and Hall/CRC, New York, 2018)
K. Nigam, A. McCallum, S. Thrun, T. Mitchell, Text classification from labeled and unlabeled documents using EM. Mach. Learn. 39, 103–134 (2000)
V. Vapnik, Statistical Learning Theory (Wiley, New York, 1998)
J. Wang, X. Shen, Large margin semi-supervised learning. J. Mach. Learn. Res. 8, 1867–1891 (2007)
J. Wang, X. Shen, W. Pan, On transductive support vector machine. Contemp. Math. 43, 7–19 (2007)
J.K. Yoo, Fused sliced inverse regression in survival analysis. Commun. Stat. Appl. Methods 24, 533–541 (2017)
J.K. Yoo, R.D. Cook, Response dimension reduction for the conditional mean in multivariate regression. Comput. Stat. Data Anal. 53, 334–343 (2008)
J.K. Yoo, K. Lee, Model-free predictor tests in survival regression through sufficient dimension reduction. Lifetime Data Anal. 17, 433–444 (2011)
X. Zhu, Z. Ghahramani, J. Lafferty, Semi-supervised learning using Gaussian fields and harmonic functions, in Proceedings of the Twentieth International Conference on Machine Learning (2003), pp. 912–919
Acknowledgements
I sincerely thank Professor R. D. Cook for raising me from an academic kid and for inspiring and supporting me all the time. Without you, I could not be who I am today. You make me try to be a better person, father, and professor.
For Jae Keun Yoo, this work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Korean Ministry of Education (NRF-2019R1F1A1050715/2019R1A6A1A11051177).
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this chapter
Yoo, J.K. (2021). Cook’s Fisher Lectureship Revisited for Semi-supervised Data Reduction. In: Bura, E., Li, B. (eds) Festschrift in Honor of R. Dennis Cook. Springer, Cham. https://doi.org/10.1007/978-3-030-69009-0_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-69008-3
Online ISBN: 978-3-030-69009-0
eBook Packages: Mathematics and Statistics (R0)