
Abstract

In nonlinear dimensionality reduction, the kernel is a square matrix whose dimension equals the number of vectors in the data set, so its size grows quadratically with the data size. In many applications the number of data vectors is very large, and the spectral decomposition of such a large-dimensional kernel encounters difficulties in at least three aspects: large memory usage, high computational complexity, and numerical instability. Although the kernels in some nonlinear DR methods are sparse matrices, which partially overcomes the difficulties in memory usage and computational complexity, it is not clear whether the instability issue can be settled. In this chapter, we study fast algorithms that avoid the spectral decomposition of large-dimensional kernels in DR processing, dramatically reducing memory usage and computational complexity as well as increasing numerical stability. In Section 15.1, we introduce the concept of rank revealing. In Section 15.2, we present randomized low-rank approximation algorithms. In Section 15.3, greedy rank-revealing algorithms (GAT) and randomized anisotropic transformation algorithms (RAT), which approximate the leading eigenvalues and eigenvectors of DR kernels, are introduced. Numerical experiments illustrating the validity of these algorithms are presented in Section 15.4. The justification of the RAT algorithms is given in Section 15.5.
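To make the idea concrete, the following is a minimal NumPy sketch of the randomized low-rank approximation strategy the abstract alludes to: instead of eigendecomposing the full n × n kernel, one probes its range with a small Gaussian random matrix and eigendecomposes the resulting small projected matrix. This is a generic illustration under our own assumptions (the helper name randomized_eig and the oversampling parameter are ours), not the chapter's GAT or RAT pseudocode.

```python
import numpy as np

def randomized_eig(K, k, oversample=10, seed=0):
    """Approximate the k leading eigenpairs of a large symmetric kernel K
    without computing its full spectral decomposition (a sketch)."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    # Probe the range of K with k + oversample Gaussian random vectors.
    Omega = rng.standard_normal((n, k + oversample))
    # Orthonormalize the sampled range (randomized range finder).
    Q, _ = np.linalg.qr(K @ Omega)
    # Project K onto the small subspace: B is only (k+p) x (k+p).
    B = Q.T @ K @ Q
    # Exact eigendecomposition of the small symmetric matrix is cheap.
    w, V = np.linalg.eigh(B)          # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]     # keep the k dominant eigenpairs
    return w[idx], Q @ V[:, idx]      # lift eigenvectors back to R^n

# Example: a 2000 x 2000 Gram kernel of numerical rank 3.
X = np.random.default_rng(1).standard_normal((2000, 3))
K = X @ X.T
vals, vecs = randomized_eig(K, k=3)
```

The cost is dominated by the product K @ Omega, which is O(n^2 k) for a dense kernel and far cheaper for a sparse one, in contrast to the O(n^3) full eigendecomposition.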

Keywords

Fast Algorithm · Random Matrix · Random Projection · Nonlinear Dimensionality Reduction · Swiss Roll



Copyright information

© Higher Education Press, Beijing and Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Jianzhong Wang, Department of Mathematics and Statistics, Sam Houston State University, Huntsville, USA
