Abstract
It is generally accepted that training in statistics must include some exposure to the mechanics of computational statistics. This exposure is essential when working with extremely high-dimensional data, where computer-aided techniques can help us discover dependencies without complicated mathematical tools.
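As a minimal illustrative sketch of such a computer-aided technique (not code from the chapter), sliced inverse regression (SIR) estimates a low-dimensional projection of high-dimensional predictors that captures their dependence on a response, using only standard linear algebra. The function name `sir_directions` and all parameters below are illustrative choices, assuming NumPy is available.

```python
# Illustrative sketch: sliced inverse regression (SIR), a computationally
# intensive dimension-reduction technique. Hypothetical helper, plain NumPy.
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=1):
    """Estimate effective dimension-reduction directions via SIR."""
    n, p = X.shape
    # Standardize (whiten) the predictors
    mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mean) @ inv_sqrt
    # Slice the observations by the ordered response and average Z per slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of the slice-mean covariance, mapped back
    # to the original predictor scale and normalized
    _, vecs = np.linalg.eigh(M)
    B = inv_sqrt @ vecs[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)

# Toy check: y depends on a 5-dimensional X only through one direction
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
beta = np.array([1.0, 2.0, 0.0, 0.0, 0.0]) / np.sqrt(5)
y = (X @ beta) ** 3 + 0.1 * rng.normal(size=500)
b = sir_directions(X, y)[:, 0]
print(abs(b @ beta))  # near 1 when the true direction is recovered (up to sign)
```

The point of the toy check is that the dependence of `y` on `X` is nonlinear, yet the single relevant direction is recovered from slice means alone, without modeling the link function.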
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this chapter
Härdle, W.K., Simar, L. (2019). Computationally Intensive Techniques. In: Applied Multivariate Statistical Analysis. Springer, Cham. https://doi.org/10.1007/978-3-030-26006-4_20
Print ISBN: 978-3-030-26005-7
Online ISBN: 978-3-030-26006-4
eBook Packages: Mathematics and Statistics (R0)