International Encyclopedia of Statistical Science

2011 Edition
Editors: Miodrag Lovric

Principal Component Analysis

  • Ian Jolliffe
Reference work entry


Large or massive data sets are increasingly common and often include measurements on many variables. It is frequently possible to reduce the number of variables considerably while still retaining much of the information in the original data set. Principal component analysis (PCA) is probably the best known and most widely used dimension-reducing technique for doing this. Suppose we have n measurements on a vector x of p random variables, and we wish to reduce the dimension from p to q, where q is typically much smaller than p. PCA does this by finding linear combinations a1′x, a2′x, …, aq′x, called principal components, that successively have maximum variance for the data, subject to each being uncorrelated with the previous ak′x. Solving this maximization problem, we find that the vectors a1, a2, …, aq are the eigenvectors of the covariance matrix, S, of the data, corresponding to the q largest eigenvalues (see Eigenvalue, Eigenvector and Eigenspace). The eigenvalues...
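The construction described above can be sketched directly: form the covariance matrix S, take its eigendecomposition, and keep the eigenvectors belonging to the q largest eigenvalues. The following is a minimal illustration in Python with NumPy; the small data matrix and the choice q = 2 are hypothetical, not from the entry.

```python
import numpy as np

# Hypothetical small data set: n = 6 observations on p = 3 variables.
X = np.array([
    [2.5, 2.4, 1.1],
    [0.5, 0.7, 0.9],
    [2.2, 2.9, 1.0],
    [1.9, 2.2, 1.3],
    [3.1, 3.0, 0.8],
    [2.3, 2.7, 1.2],
])

q = 2  # reduce from p = 3 to q = 2 dimensions

# Covariance matrix S of the data (variables in columns).
S = np.cov(X, rowvar=False)

# Eigendecomposition; for a symmetric matrix eigh returns eigenvalues
# in ascending order, so reverse to put the largest first.
eigvals, eigvecs = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The first q eigenvectors a1, ..., aq define the principal components;
# the component scores are the projections ak'x of the centered data.
A = eigvecs[:, :q]
scores = (X - X.mean(axis=0)) @ A

print(eigvals[:q])   # variances of the first q components
print(scores.shape)  # one q-dimensional score vector per observation
```

As a check on the "uncorrelated with previous components" property, the sample covariance matrix of the scores is diagonal, with the retained eigenvalues on its diagonal.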


References and Further Reading

  1. Hotelling H (1933) Analysis of a complex of statistical variables into principal components. J Educ Psychol 24:417–441, 498–520
  2. Jackson JE (1991) A user's guide to principal components. Wiley, New York
  3. Jolliffe IT (2002) Principal component analysis, 2nd edn. Springer, New York
  4. Pearson K (1901) On lines and planes of closest fit to systems of points in space. Philos Mag 2:559–572
  5. Yule W, Berger M, Butler S, Newham V, Tizard J (1969) The WPPSI: an empirical evaluation with a British sample. Brit J Educ Psychol 39:1–13

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Ian Jolliffe
  1. University of Exeter, Exeter, UK