Abstract
The foremost nonlinear dimensionality reduction algorithms provide an embedding only for the given training data, with no straightforward extension for test points. This shortcoming makes them unsuitable for problems such as classification and regression. We propose a novel dimensionality reduction algorithm which learns a parametric mapping between the high-dimensional space and the embedded space. The key observation is that when the dimensionality of the data exceeds the number of data points, it is always possible to find a linear transformation that preserves a given subset of pairwise distances while changing the distances of another subset. Our method first maps the points into a high-dimensional feature space, and then explicitly searches for an affine transformation that preserves local distances while pulling non-neighbor points as far apart as possible. This search is formulated as an instance of semidefinite programming, and the resulting transformation can be used to map out-of-sample points into the embedded space.
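The key observation can be illustrated with a small numpy sketch. This is not the paper's algorithm (which solves a semidefinite program over all constraints jointly); it is a hand-constructed example showing that, with fewer points than dimensions, a linear map exists that leaves a chosen subset of distances untouched while stretching another. All variable names and the choice of pairs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 10          # fewer points than dimensions, as the observation requires
X = rng.standard_normal((n, d))

# pairs whose distances we want to preserve (a "neighborhood" subset)
preserve = [(0, 1), (1, 2), (2, 3)]
# a non-neighbor pair whose distance we want to change
stretch = (0, 4)

# The difference vectors of the preserved pairs span at most len(preserve)
# dimensions, so with d > n there is a direction v orthogonal to all of them.
D = np.array([X[i] - X[j] for i, j in preserve])
_, _, Vt = np.linalg.svd(D)
null_basis = Vt[len(preserve):]           # orthonormal basis of D's null space

# Pick v in that null space with a nonzero component along the stretch pair
# (nonzero with probability one for generic data).
ds = X[stretch[0]] - X[stretch[1]]
v = null_basis.T @ (null_basis @ ds)
v /= np.linalg.norm(v)

# A acts as the identity orthogonal to v and scales by c along v.
c = 3.0
A = np.eye(d) + (c - 1.0) * np.outer(v, v)

Y = X @ A.T
for i, j in preserve:                     # neighbor distances are unchanged
    assert np.isclose(np.linalg.norm(Y[i] - Y[j]),
                      np.linalg.norm(X[i] - X[j]))
print(np.linalg.norm(X[stretch[0]] - X[stretch[1]]),
      np.linalg.norm(Y[stretch[0]] - Y[stretch[1]]))
```

Because each preserved difference vector is orthogonal to v, the map A fixes it exactly, while the stretch pair's component along v is scaled by c, increasing its distance. The paper's method replaces this hand-picked direction with an SDP search over positive semidefinite matrices M = AᵀA, where distance constraints are linear in M.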
© 2010 Springer-Verlag Berlin Heidelberg
Tadavani, P.K., Ghodsi, A. (2010). Learning an Affine Transformation for Non-linear Dimensionality Reduction. In: Balcázar, J.L., Bonchi, F., Gionis, A., Sebag, M. (eds.) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2010. Lecture Notes in Computer Science, vol. 6322. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-15883-4_2
Print ISBN: 978-3-642-15882-7
Online ISBN: 978-3-642-15883-4