
Abstract

Unlike the Isomap method, which preserves geodesic distances, the MVU method learns the data from local similarities, preserving both the distances and the angles between all pairs of neighbors of each point in the data set. Since the method maximizes the variance of the output while keeping the local structure fixed during dimensionality reduction, it is called maximum variance unfolding (MVU). Like multidimensional scaling (MDS), MVU can be applied to cases in which only the local similarities of the objects in a set are given; in these cases, MVU tries to find a configuration that preserves the given similarities. Technically, MVU adopts semidefinite programming (SDP) to solve the DR problem, so it is also called semidefinite embedding (SDE) or simply SDP. Solving a DR problem with MVU is expensive in both memory and time. To reduce this high computational cost, landmark MVU (LMVU) was introduced. In Section 9.1, we describe the MVU method and the corresponding maximization model. In Section 9.2, we give a brief review of SDP and introduce several popular SDP software packages. Experiments and applications of MVU are included in Section 9.3. LMVU is discussed in Section 9.4.
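The abstract notes that MVU learns a Gram (kernel) matrix K by SDP and then reads the low-dimensional coordinates off K. The sketch below illustrates only that final, solver-independent step: assuming K has already been returned by an SDP solver (the solve itself is omitted here), the embedding is recovered from K's top eigenvectors, as in classical MDS. The function name `embedding_from_gram` and the toy data are illustrative, not from the chapter.

```python
import numpy as np

def embedding_from_gram(K, d):
    """Recover a d-dimensional configuration Y with Y @ Y.T ~ K.

    In MVU, K is the centered PSD kernel matrix returned by the SDP
    solver; the embedding uses the d largest eigenvalue/vector pairs.
    """
    w, V = np.linalg.eigh(K)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:d]            # indices of the d largest
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Toy check: build K from known centered 2-D points and confirm that the
# recovered configuration preserves all pairwise distances (the quantity
# MVU's neighborhood constraints are designed to protect).
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 2))
X -= X.mean(axis=0)                          # MVU also constrains K to be centered
K = X @ X.T
Y = embedding_from_gram(K, 2)

D_X = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
D_Y = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
assert np.allclose(D_X, D_Y)                 # distances match up to rotation
```

In the full method, K is additionally required to satisfy K_ii - 2K_ij + K_jj = ||x_i - x_j||^2 for every neighboring pair (i, j) while trace(K) is maximized; that SDP is what Section 9.2's solver packages handle.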

Keywords

Dimensionality Reduction; Local Distance; Sensor Location; Neural Information Processing System; Neighborhood System



Copyright information

© Higher Education Press, Beijing and Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

Jianzhong Wang, Department of Mathematics and Statistics, Sam Houston State University, Huntsville, USA
