
Computationally Intensive Techniques

Applied Multivariate Statistical Analysis

Abstract

It is generally accepted that training in statistics must include some exposure to the mechanics of computational statistics. This exposure becomes essential when we consider extremely high-dimensional data. Computer-aided techniques can help us discover dependencies in high dimensions without complicated mathematical tools.
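
To make the idea concrete, below is a minimal sketch (not taken from the chapter) of projection pursuit in the spirit of Friedman and Tukey (1974): scan many random unit directions and keep the one whose one-dimensional projection of the data looks least Gaussian. The interestingness index used here (absolute excess kurtosis), the toy data, and all function names are illustrative assumptions; the chapter's own indices and algorithms may differ.

```python
# Crude projection pursuit sketch: random search over unit directions,
# scoring each 1-D projection by how non-Gaussian it looks.
import numpy as np

def excess_kurtosis(z):
    """Absolute excess kurtosis of a 1-D sample (zero for Gaussian data)."""
    z = (z - z.mean()) / z.std()
    return abs(np.mean(z ** 4) - 3.0)

def projection_pursuit(X, n_directions=5000, seed=0):
    """Try random unit directions and return the one with the largest index."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    best_dir, best_val = None, -np.inf
    for _ in range(n_directions):
        a = rng.standard_normal(p)
        a /= np.linalg.norm(a)            # random point on the unit sphere
        val = excess_kurtosis(X @ a)      # "interestingness" of this projection
        if val > best_val:
            best_dir, best_val = a, val
    return best_dir, best_val

# Toy data: a 10-dimensional Gaussian cloud hiding a bimodal direction.
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 10))
X[:, 0] += np.where(rng.random(500) < 0.5, -3.0, 3.0)   # two clusters along axis 0

direction, index = projection_pursuit(X)
print("best direction (rounded):", np.round(direction, 2))
print("index value:", round(index, 2))
```

On this toy data the search recovers, up to sign, the first coordinate axis: the direction along which the two hidden clusters separate.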



Author information

Correspondence to Wolfgang Karl Härdle.


Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Härdle, W.K., Simar, L. (2019). Computationally Intensive Techniques. In: Applied Multivariate Statistical Analysis. Springer, Cham. https://doi.org/10.1007/978-3-030-26006-4_20

