Leaving Local Optima in Unsupervised Kernel Regression
Embedding high-dimensional patterns in low-dimensional latent spaces is a challenging task. In this paper, we introduce re-sampling strategies for leaving local optima in the data space reconstruction error (DSRE) minimization process of unsupervised kernel regression (UKR). To this end, we concentrate on a hybrid UKR variant that combines iterative solution construction with gradient-descent-based optimization. Patterns with high reconstruction errors are removed from the manifold and re-sampled based on Gaussian sampling. The re-sampling variants differ in which pattern reconstruction errors they consider, in the number of re-sampled patterns, and in their termination conditions. The re-sampling process with UKR can also improve ISOMAP embeddings. Experiments on typical benchmark data sets illustrate the capabilities of the strategies for leaving local optima.
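The core re-sampling step described above can be illustrated with a minimal sketch. This is not the paper's exact algorithm: the Nadaraya-Watson form of the UKR reconstruction and the per-pattern DSRE are standard, but the bandwidth, the number of re-sampled patterns, and the choice to draw new latent positions from a Gaussian centered at the latent mean are illustrative assumptions.

```python
import numpy as np

def ukr_reconstruct(latent, patterns, bandwidth=1.0):
    """Leave-one-out Nadaraya-Watson reconstruction of patterns
    from their latent positions with a Gaussian kernel."""
    # pairwise squared distances between latent points
    d2 = ((latent[:, None, :] - latent[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * bandwidth ** 2))
    np.fill_diagonal(K, 0.0)            # leave-one-out: drop self-weight
    W = K / K.sum(axis=1, keepdims=True)
    return W @ patterns                 # each row: weighted mean of the others

def resample_worst(latent, patterns, n_resample=3, sigma=0.1, rng=None):
    """Remove the latent positions of the worst-reconstructed patterns
    and re-sample them from a Gaussian around the latent mean
    (the Gaussian's center and sigma are illustrative choices)."""
    rng = np.random.default_rng() if rng is None else rng
    recon = ukr_reconstruct(latent, patterns)
    errors = ((patterns - recon) ** 2).sum(axis=1)   # per-pattern DSRE
    worst = np.argsort(errors)[-n_resample:]         # highest reconstruction errors
    new_latent = latent.copy()
    new_latent[worst] = rng.normal(latent.mean(axis=0), sigma,
                                   size=(n_resample, latent.shape[1]))
    return new_latent, errors
```

In a hybrid scheme as described in the abstract, such a step would alternate with gradient descent on the DSRE, accepting a re-sampled configuration only if subsequent optimization reaches a lower error.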
Keywords: Local Optimum · Latent Space · Reconstruction Error · Gaussian Kernel Function · Nonlinear Dimensionality Reduction