The Equivalence Between Principal Component Analysis and Nearest Flat in the Least Square Sense

Article

Abstract

In this paper, we establish the equivalence between principal component analysis and the nearest q-flat in the least-squares sense by showing that, for m given data points, the linear manifold at the smallest total squared distance is identical to the linear manifold capturing the largest variance. From this observation, we also give a new, simpler proof of the approach for finding the nearest q-flat.
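
The following NumPy sketch is a minimal numerical illustration of this equivalence and is not taken from the article: it assumes the q-flat passes through the sample mean and is spanned by the top q principal directions, and it compares its total squared distance against randomly drawn q-flats through the mean. The names m, n, q, B_pca, and sq_dist_to_flat are illustrative choices, not notation from the paper.

    # Minimal sketch (assumption: PCA flat = mean + span of top-q principal directions)
    import numpy as np

    rng = np.random.default_rng(0)
    m, n, q = 200, 5, 2                      # m data points in R^n, fit a q-flat

    X = rng.normal(size=(m, n))              # data points as rows (illustrative data)
    mu = X.mean(axis=0)                      # sample mean
    Xc = X - mu                              # centered data

    # PCA flat: span of the top-q right singular vectors of the centered data
    # (equivalently, eigenvectors of the scatter matrix Xc^T Xc), through the mean.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    B_pca = Vt[:q].T                         # n x q orthonormal basis

    def sq_dist_to_flat(X, point, B):
        """Sum of squared distances from the rows of X to the flat point + span(B)."""
        Y = X - point
        residual = Y - Y @ B @ B.T           # component orthogonal to the flat
        return np.sum(residual ** 2)

    d_pca = sq_dist_to_flat(X, mu, B_pca)

    # Any other q-flat through the mean (here: random orthonormal bases)
    # should do no better in the least-squares sense.
    for _ in range(1000):
        Q, _ = np.linalg.qr(rng.normal(size=(n, q)))
        assert sq_dist_to_flat(X, mu, Q) >= d_pca - 1e-9

    print("least-squares distance of the PCA q-flat:", d_pca)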

Keywords

Linear manifold · Unsupervised learning · Nearest q-flat · Principal component analysis · Eigenvalue decomposition

Mathematics Subject Classification

15A18 · 58C40

Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Zhijiang College, Zhejiang University of Technology, Hangzhou, People’s Republic of China
  2. College of Science, China Agricultural University, Beijing, People’s Republic of China
