Sum and Product Kernel Regularization Networks
We study the problem of learning from examples (i.e., supervised learning) by means of function approximation theory. Approximation problems formulated as regularized minimization problems with kernel-based stabilizers admit an explicit solution in the form of a linear combination of kernel functions, i.e., a one-hidden-layer feed-forward neural network. Building on Aronszajn's results on sums and products of kernels, we derive two new approximation schemes: the Sum Kernel Regularization Network and the Product Kernel Regularization Network. We present concrete applications of the derived schemes, demonstrate their performance experimentally, and compare them to classical solutions, which they outperform on many tasks.
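The construction described above can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions, not the authors' implementation: the representer theorem gives the minimizer as f(x) = Σ_i c_i K(x, x_i), where the coefficients solve (K + λI)c = y, and Aronszajn's results guarantee that the sum K1 + K2 and the (elementwise, on Gram matrices) product K1 · K2 of two kernels are again valid kernels. The Gaussian kernel, the widths, and the regularization parameter λ are illustrative choices.

```python
import numpy as np

def gaussian_kernel(X, Z, width):
    # Gram matrix of a Gaussian kernel with the given width (assumed choice).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit(K, y, lam):
    # Regularization network coefficients: solve (K + lam*I) c = y.
    return np.linalg.solve(K + lam * np.eye(len(K)), y)

def predict(K_eval, c):
    # f(x) = sum_i c_i K(x, x_i), evaluated via a kernel matrix.
    return K_eval @ c

# Toy 1-D regression data (purely illustrative).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

K1 = gaussian_kernel(X, X, width=0.5)
K2 = gaussian_kernel(X, X, width=2.0)

# Sum kernel network vs. product kernel network.
c_sum = fit(K1 + K2, y, lam=1e-2)
c_prod = fit(K1 * K2, y, lam=1e-2)
pred_sum = predict(K1 + K2, c_sum)
pred_prod = predict(K1 * K2, c_prod)
```

Both variants reuse the same linear-system solver; only the Gram matrix changes, which is what makes the sum and product constructions drop-in replacements for a single-kernel network.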
Keywords: Kernel Function · Product Kernel · Reproducing Kernel Hilbert Space · Representer Theorem