Abstract
Principal component analysis (PCA) based on Hebbian learning was originally designed for data processing in Euclidean spaces. In this contribution we present an extension of Oja's online learning approach to non-Euclidean spaces. First we review the kernel PCA approach. We show that, for differentiable kernels, this approach can be formulated as an online learning scheme. Hence, PCA can be carried out explicitly in the data space, now equipped with a non-Euclidean metric. Moreover, the theoretical framework can be extended to principal component learning in Banach spaces based on semi-inner products. This becomes particularly important when learning in lp-norm spaces with p ≠ 2 is considered. In this contribution we focus on the mathematics and the theoretical justification of the approach.
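For orientation, the classical Euclidean starting point of the paper, Oja's online Hebbian learning rule, can be sketched as follows. This is a minimal illustrative sketch only (the toy data, learning rate, and sample count are assumptions); the paper's actual contribution is the non-Euclidean extension via differentiable kernels and semi-inner products, which is not shown here.

```python
import random
import math

random.seed(0)

def oja_step(w, x, eta=0.01):
    """One Oja update: w <- w + eta * y * (x - y * w), with y = <w, x>.
    The subtractive term keeps w (approximately) unit-length, so w
    converges toward the leading eigenvector of the data covariance."""
    y = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + eta * y * (xi - y * wi) for wi, xi in zip(w, x)]

def sample():
    # Toy 2-D data whose dominant variance lies along (1, 1)/sqrt(2).
    a = random.gauss(0.0, 3.0)   # large variance along (1, 1)
    b = random.gauss(0.0, 0.3)   # small variance along (1, -1)
    s = 1.0 / math.sqrt(2.0)
    return [s * (a + b), s * (a - b)]

w = [1.0, 0.0]                   # arbitrary initial weight vector
for _ in range(5000):
    w = oja_step(w, sample())

norm = math.sqrt(sum(wi * wi for wi in w))
print("norm of w:", norm)                      # close to 1
print("direction:", [wi / norm for wi in w])   # close to +/-(0.707, 0.707)
```

The rule performs the online update in the data space directly; the paper's kernelized variant replaces the Euclidean inner product y = ⟨w, x⟩ with a kernel-induced (or semi-inner-product) evaluation.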
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Biehl, M., Kästner, M., Lange, M., Villmann, T. (2013). Non-Euclidean Principal Component Analysis and Oja’s Learning Rule – Theoretical Aspects. In: Estévez, P., Príncipe, J., Zegers, P. (eds) Advances in Self-Organizing Maps. Advances in Intelligent Systems and Computing, vol 198. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35230-0_3
Print ISBN: 978-3-642-35229-4
Online ISBN: 978-3-642-35230-0