Kernel PCA Pattern Reconstruction via Approximate Pre-Images
Algorithms based on Mercer kernels construct their solutions in terms of expansions in a high-dimensional feature space F. Previous work has shown that all algorithms which can be formulated in terms of dot products in F can be carried out using a kernel without explicitly working in F. The list of such algorithms includes support vector machines and nonlinear kernel principal component extraction. So far, however, this list has not included the reconstruction of patterns from their largest nonlinear principal components, a technique that is common practice in linear principal component analysis.
The present work proposes an idea for approximately performing this task. As an illustrative example, an application to the de-noising of data clusters is presented.
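The idea can be illustrated with a short sketch. The snippet below is not the authors' algorithm but a simplified, assumed construction in the same spirit: it extracts the leading kernel principal components with a Gaussian kernel, projects a noisy point onto them, and recovers an approximate pre-image in input space by a fixed-point iteration (feature-space centering is omitted for brevity; `gamma`, `n_components`, and `n_iter` are illustrative parameters).

```python
import numpy as np

def rbf(X, Y, gamma):
    """Gaussian Mercer kernel matrix, k(x, y) = exp(-gamma * ||x - y||^2)."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kpca_denoise(X, x, gamma=1.0, n_components=1, n_iter=100):
    """Project x onto the leading kernel principal components of X, then
    find an approximate pre-image of the projection by fixed-point
    iteration (a common heuristic for Gaussian kernels; centering in
    feature space is omitted here for brevity)."""
    K = rbf(X, X, gamma)
    vals, vecs = np.linalg.eigh(K)                 # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]    # keep the largest ones
    alphas = vecs[:, idx] / np.sqrt(vals[idx])     # unit-norm directions in F
    # Projection onto the components is again an expansion in the x_i:
    # P phi(x) = sum_i w_i phi(x_i), with the weights w computed below.
    beta = alphas.T @ rbf(X, x[None, :], gamma).ravel()  # component scores
    w = alphas @ beta                                    # expansion weights
    z = x.copy()                                         # initial guess
    for _ in range(n_iter):
        kz = w * rbf(X, z[None, :], gamma).ravel()
        denom = kz.sum()
        if abs(denom) < 1e-12:   # guard against a degenerate denominator
            break
        z = (kz @ X) / denom     # weighted mean of the training points
    return z
```

For a compact data cluster, projecting onto a few leading components discards the "noise" directions, and the recovered pre-image `z` is pulled back toward the dense region of the data, which is the de-noising effect described above.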
Keywords: Input Space · Reproducing Kernel Hilbert Space · Kernel Principal Component Analysis · Support Vector Classifier · Mercer Kernel