Abstract
A key idea of nonlinear Support Vector Machines (SVMs) is to map the inputs nonlinearly to a high-dimensional feature space, while Mercer's condition is applied in order to avoid an explicit expression for the nonlinear mapping. In SVMs for nonlinear classification a large-margin classifier is constructed in the feature space; for regression, a linear regressor is constructed in the feature space. Other kernel extensions of linear algorithms have been proposed, such as kernel Principal Component Analysis (PCA) and kernel Fisher Discriminant Analysis. In this paper, we discuss the extension of linear Canonical Correlation Analysis (CCA) to kernel CCA by applying the Mercer condition. We also discuss links with single-output Least Squares SVM (LS-SVM) regression and classification.
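The abstract describes the kernel trick behind kernel CCA: rather than computing the nonlinear feature map explicitly, one works with Gram (kernel) matrices and solves the resulting dual eigenvalue problem. As a rough illustration only, the sketch below implements one common regularized kernel CCA formulation in Python; the RBF kernel, the regularization constant kappa, the kernel width sigma, and this particular generalized eigenvalue formulation are assumptions chosen for the example and are not taken from the paper, which develops kernel CCA and its LS-SVM links in its own formulation.

```python
# Minimal regularized kernel CCA sketch (illustrative; not the paper's exact
# LS-SVM formulation). Kernel choice, sigma and kappa are assumptions.
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(A, B, sigma=1.0):
    """Gram matrix of the Gaussian (RBF) kernel between rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def center_gram(K):
    """Center the Gram matrix, i.e. perform feature-space centering implicitly."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_cca(X, Y, sigma=1.0, kappa=1e-3):
    """Leading canonical correlation and dual coefficients (alpha, beta)."""
    n = X.shape[0]
    Kx = center_gram(rbf_kernel(X, X, sigma))
    Ky = center_gram(rbf_kernel(Y, Y, sigma))
    Z, I = np.zeros((n, n)), np.eye(n)
    # Symmetric generalized eigenvalue problem in the dual variables:
    #   [0      Kx Ky] [a]         [(Kx+kI)^2      0     ] [a]
    #   [Ky Kx  0    ] [b] = rho * [   0       (Ky+kI)^2 ] [b]
    A = np.block([[Z, Kx @ Ky], [Ky @ Kx, Z]])
    B = np.block([[(Kx + kappa * I) @ (Kx + kappa * I), Z],
                  [Z, (Ky + kappa * I) @ (Ky + kappa * I)]])
    vals, vecs = eigh(A, B)            # eigenvalues in ascending order
    rho, v = vals[-1], vecs[:, -1]     # keep the largest one
    return rho, v[:n], v[n:]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = rng.uniform(-np.pi, np.pi, 100)
    X = np.column_stack([np.sin(t), np.cos(t)]) + 0.05 * rng.standard_normal((100, 2))
    Y = np.column_stack([t, t ** 2]) + 0.05 * rng.standard_normal((100, 2))
    rho, alpha, beta = kernel_cca(X, Y, sigma=1.0, kappa=1e-2)
    print("leading kernel canonical correlation:", rho)
```

As kappa tends to zero the problem approaches unregularized kernel CCA, which is typically ill-posed because the centered Gram matrices are rank-deficient; the regularization term keeps the right-hand-side matrix positive definite.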
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Van Gestel, T., Suykens, J.A.K., De Brabanter, J., De Moor, B., Vandewalle, J. (2001). Kernel Canonical Correlation Analysis and Least Squares Support Vector Machines. In: Dorffner, G., Bischof, H., Hornik, K. (eds) Artificial Neural Networks — ICANN 2001. ICANN 2001. Lecture Notes in Computer Science, vol 2130. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44668-0_54
DOI: https://doi.org/10.1007/3-540-44668-0_54
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42486-4
Online ISBN: 978-3-540-44668-2