
Model Selection for Regularized Least-Squares Algorithm in Learning Theory

Published in: Foundations of Computational Mathematics

Abstract

We investigate the problem of model selection for learning algorithms that depend on a continuous parameter. We propose a model selection procedure based on a worst-case analysis and on a data-independent choice of the parameter. For the regularized least-squares algorithm we bound the generalization error of the solution by a quantity depending on a few known constants, and we show that the corresponding model selection procedure reduces to solving a bias–variance problem. Under suitable smoothness conditions on the regression function, we estimate the optimal parameter as a function of the number of data points and prove that this choice ensures consistency of the algorithm.
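The regularized least-squares algorithm discussed in the abstract can be sketched as kernel ridge regression with a data-independent regularization parameter λ_n that shrinks as the sample size n grows. The following minimal sketch uses a Gaussian kernel and the decay rate λ_n = n^(-1/2) purely for illustration; the kernel, the rate, and all variable names are assumptions, not the specific choices analyzed in the paper.

```python
import numpy as np

def rls_fit(X, y, lam, gamma=1.0):
    """Regularized least-squares in an RKHS: solve (K + n*lam*I) c = y."""
    # Gaussian kernel Gram matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = len(y)
    # Tikhonov-regularized normal equations for the expansion coefficients
    c = np.linalg.solve(K + n * lam * np.eye(n), y)
    return c, K

# Synthetic regression data (illustrative)
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)

# Data-independent parameter choice: lambda depends only on n,
# not on the sample itself (the n^{-1/2} exponent is illustrative)
lam = n ** -0.5
c, K = rls_fit(X, y, lam)

train_pred = K @ c
mse = np.mean((train_pred - y) ** 2)
```

The key point mirrored from the abstract is that `lam` is fixed by `n` alone before seeing the data, in contrast to data-driven procedures such as cross-validation.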


Author information

Correspondence to E. De Vito, A. Caponnetto or L. Rosasco.



Cite this article

De Vito, E., Caponnetto, A. & Rosasco, L. Model Selection for Regularized Least-Squares Algorithm in Learning Theory. Found Comput Math 5, 59–85 (2005). https://doi.org/10.1007/s10208-004-0134-1

