Abstract
A theory of learning is proposed, which naturally extends the classic regularization framework of kernel machines to the case in which the agent interacts with a richer environment, compactly described by the notion of constraint. Variational calculus is exploited to derive general representer theorems that describe the structure of the solution to the learning problem. It is shown that such a solution can be represented in terms of constraint reactions, which recall the corresponding notion in analytic mechanics. In particular, the derived representer theorems clearly show how the classic kernel expansion on support vectors extends to an expansion on support constraints. As an application of the proposed theory, three examples are given, which illustrate the dimensional collapse to a finite-dimensional space of parameters. The constraint reactions are calculated for the classic collection of supervised examples, for the case of box constraints, and for the case of hard holonomic linear constraints mixed with supervised examples. Interestingly, this leads to representer theorems for which the mathematical and algorithmic apparatus of kernel machines can be re-used.
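For orientation, here is a minimal sketch of the classic supervised regularization setting from which the above extension starts; the notation (loss V, kernel k, regularization parameter λ, sample size ℓ) is illustrative and not taken from the chapter. Given supervised pairs (x_i, y_i), i = 1, …, ℓ, in a reproducing kernel Hilbert space \(\mathcal{H}\) with kernel k, one solves

\[ \min_{f \in \mathcal{H}} \; \frac{1}{\ell} \sum_{i=1}^{\ell} V\bigl(y_i, f(x_i)\bigr) \;+\; \lambda \,\|f\|_{\mathcal{H}}^{2}, \]

whose minimizer, by the classic representer theorem, expands on the supervised points (the support vectors):

\[ f^{\star}(x) \;=\; \sum_{i=1}^{\ell} \alpha_i \, k(x, x_i), \qquad \alpha_i \in \mathbb{R}. \]

In the theory summarized above, the finite sum of pointwise losses is replaced by general (soft or hard, possibly unilateral or holonomic) constraints on f, and the resulting representer theorems express \(f^{\star}\) through the corresponding constraint reactions; the expansion above is recovered when the constraints reduce to the supervised pairs themselves.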
Keywords
- Soft Constraint
- Hard Constraint
- Unilateral Constraint
- Holonomic Constraint
- Kernel Machine
Part of this chapter is an excerpt of the paper “Foundations of Support Constraint Machines,” by the same authors, to appear in Neural Computation, which contains a significantly broader view of the subject along with a more general framework, algorithmic issues, and references to applications.
References
Adams, R.A., Fournier, J.J.F.: Sobolev Spaces, 2nd edn. Academic Press (2003)
Argyriou, A., Micchelli, C.A., Pontil, M.: When is there a representer theorem? Vector versus matrix regularizers. Journal of Machine Learning Research 10, 2507–2529 (2009)
Attouch, H., Buttazzo, G., Michaille, G.: Variational Analysis in Sobolev and BV Spaces. Applications to PDEs and Optimization. SIAM, Philadelphia (2006)
Diligenti, M., Gori, M., Maggini, M., Rigutini, L.: Multitask kernel-based learning with logic constraints. In: Proc. 19th European Conf. on Artificial Intelligence, pp. 433–438 (2010)
Diligenti, M., Gori, M., Maggini, M., Rigutini, L.: Bridging logic and kernel machines. Machine Learning 86, 57–88 (2012)
Dinuzzo, F., Schoelkopf, B.: The representer theorem for Hilbert spaces: A necessary and sufficient condition. In: Proc. Neural Information Processing Systems (NIPS) Conference, pp. 189–196 (2012)
Giaquinta, M., Hildebrandt, S.: Calculus of Variations I, vol. 1. Springer (1996)
Gnecco, G., Gori, M., Melacci, S., Sanguineti, M.: Learning with hard constraints. In: Mladenov, V., Koprinkova-Hristova, P., Palm, G., Villa, A.E.P., Appollini, B., Kasabov, N. (eds.) ICANN 2013. LNCS, vol. 8131, pp. 146–153. Springer, Heidelberg (2013)
Gnecco, G., Gori, M., Melacci, S., Sanguineti, M.: A theoretical framework for supervised learning from regions. Neurocomputing 129, 25–32 (2014)
Gnecco, G., Gori, M., Sanguineti, M.: Learning with boundary conditions. Neural Computation 25, 1029–1106 (2013)
Gnecco, G., Gori, M., Melacci, S., Sanguineti, M.: Foundations of support constraint machines. Neural Computation (to appear)
Gori, M., Melacci, S.: Constraint verification with kernel machines. IEEE Transactions on Neural Networks and Learning Systems 24, 825–831 (2013)
Klement, E.P., Mesiar, R., Pap, E.: Triangular Norms. Kluwer (2000)
Kunapuli, G., Bennett, K.P., Shabbeer, A., Maclin, R., Shavlik, J.: Online knowledge-based support vector machines. In: Balcázar, J.L., Bonchi, F., Gionis, A., Sebag, M. (eds.) ECML PKDD 2010, Part II. LNCS (LNAI), vol. 6322, pp. 145–161. Springer, Heidelberg (2010)
Mangasarian, O.L., Wild, E.W.: Nonlinear knowledge-based classification. IEEE Transactions on Neural Networks 19, 1826–1832 (2008)
Melacci, S., Gori, M.: Learning with box kernels. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(11), 2680–2692 (2013)
Melacci, S., Maggini, M., Gori, M.: Semi–supervised learning with constraints for multi–view object recognition. In: Alippi, C., Polycarpou, M., Panayiotou, C., Ellinas, G. (eds.) ICANN 2009, Part II. LNCS, vol. 5769, pp. 653–662. Springer, Heidelberg (2009)
Poggio, T., Girosi, F.: A theory of networks for approximation and learning. Technical report. MIT (1989)
Schwartz, L.: Théorie des distributions. Hermann, Paris (1978)
Shawe-Taylor, J., Cristianini, N.: Kernel Methods for Pattern Analysis. Cambridge University Press (2004)
Sun, Z., Zhang, Z., Wang, H., Jiang, M.: Cutting plane method for continuously constrained kernel-based regression. IEEE Transactions on Neural Networks 21, 238–247 (2010)
Suykens, J.A.K., Alzate, C., Pelckmans, K.: Primal and dual model representations in kernel-based learning. Statistics Surveys 4, 148–183 (2010)
Theodoridis, S., Slavakis, K., Yamada, I.: Adaptive learning in a world of projections. IEEE Signal Processing Magazine 28, 97–123 (2011)
Tikhonov, A.N., Arsenin, V.Y.: Solution of ill-posed problems. W.H. Winston, Washington, DC (1977)
Yuille, A.L., Grzywacz, N.M.: A mathematical analysis of the motion coherence theory. International Journal of Computer Vision 3, 155–175 (1989)
Copyright information
© 2015 Springer International Publishing Switzerland
About this paper
Cite this paper
Gnecco, G., Gori, M., Melacci, S., Sanguineti, M. (2015). Learning as Constraint Reactions. In: Koprinkova-Hristova, P., Mladenov, V., Kasabov, N. (eds) Artificial Neural Networks. Springer Series in Bio-/Neuroinformatics, vol 4. Springer, Cham. https://doi.org/10.1007/978-3-319-09903-3_12
DOI: https://doi.org/10.1007/978-3-319-09903-3_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-09902-6
Online ISBN: 978-3-319-09903-3