VC Dimensions of Principal Component Analysis
Motivated by a statistical learning theoretic treatment of principal component analysis, we study the class of subsets of ℝ^d consisting of all points within a given distance of a k-dimensional affine subspace. We prove that the VC dimension of this class is within a constant factor of (k+1)(d−k+1), and then use our bounds together with Vapnik's statistical learning theory to discuss the distribution of eigenvalues of a data covariance matrix. In the course of the upper bound proof, we give a simple proof of Warren's bound on the number of sign sequences of real polynomials.
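To make the underlying set class concrete: a point x ∈ ℝ^d belongs to one of the sets in question exactly when its Euclidean distance to some k-dimensional affine subspace is at most a threshold. The following sketch (an illustration, not part of the paper; the function name and NumPy usage are my own) computes that distance by orthogonal projection onto the subspace's direction space.

```python
import numpy as np

def dist_to_affine_subspace(x, p, V):
    """Distance from point x to the affine subspace p + span(columns of V).

    x : point in R^d, p : a point on the subspace,
    V : d-by-k matrix whose columns span the subspace's directions.
    """
    # Orthonormalize the direction vectors.
    Q, _ = np.linalg.qr(V)
    r = x - p
    # Remove the component of r lying inside the subspace;
    # what remains is the orthogonal residual.
    return np.linalg.norm(r - Q @ (Q.T @ r))

# Example: distance of (1, 2, 3) to the x-axis (a k = 1 subspace of R^3),
# i.e. sqrt(2^2 + 3^2) = sqrt(13).
x = np.array([1.0, 2.0, 3.0])
p = np.zeros(3)
V = np.array([[1.0], [0.0], [0.0]])
d = dist_to_affine_subspace(x, p, V)
```

Membership in a set of the class is then the indicator `d <= eps` for a chosen radius eps; the paper's result bounds how many distinct labelings such indicators can induce on a finite point set.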