Neural Processing Letters, Volume 21, Issue 3, pp 175–188

An Incremental Learning Strategy for Support Vector Regression


DOI: 10.1007/s11063-004-5714-1

Cite this article as:
Wang, W. Neural Process Lett (2005) 21: 175. doi:10.1007/s11063-004-5714-1


The support vector machine (SVM) provides good generalization performance but requires a large amount of computation. This paper presents an incremental learning strategy for support vector regression (SVR). The new method first derives an explicit expression for ||W||² by constructing an orthogonal basis in feature space together with a basic Hilbert-space identity, and then finds the regression function by minimizing this expression for ||W||² rather than by solving a convex programming problem. In particular, the minimization of ||W||² is combined with kernel selection, which can lead to good generalization performance. The presented method not only provides a novel way to perform incremental SVR learning, but also opens an opportunity for model selection in SVR. An artificial data set, a benchmark data set and a real-world data set are employed to evaluate the method. The simulations support the feasibility and effectiveness of the proposed approach.
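The abstract does not spell out how the orthogonal basis in feature space is built. One standard way to construct such a basis incrementally, using only kernel evaluations (a kernel Gram–Schmidt procedure), is sketched below; the function and variable names are hypothetical and the tolerance-based novelty test is an assumption, not necessarily the paper's exact algorithm.

```python
import numpy as np

def incremental_basis(samples, kernel, tol=1e-8):
    """Incrementally build an orthonormal basis of span{phi(x_1), ..., phi(x_n)}.

    Hypothetical sketch: each basis vector e_j is stored only through its
    expansion coefficients over the retained samples,
        e_j = sum_i coeffs[j, i] * phi(samples[basis_idx[i]]),
    so every inner product is evaluated via the kernel and phi is never
    computed explicitly.
    """
    basis_idx = []   # indices of samples whose feature images span the basis
    coeffs = []      # coeffs[j]: expansion coefficients of e_j
    for t, x in enumerate(samples):
        k_vec = np.array([kernel(samples[i], x) for i in basis_idx])
        # proj[j] = <e_j, phi(x)>, computed through the kernel trick
        proj = np.array([np.dot(c, k_vec) for c in coeffs])
        # squared norm of the component of phi(x) orthogonal to the span
        res2 = kernel(x, x) - np.dot(proj, proj)
        if res2 > tol:
            r = np.sqrt(res2)
            m = len(basis_idx)
            # pad old coefficient vectors with a zero slot for the new sample
            coeffs = [np.append(c, 0.0) for c in coeffs]
            # e_new = (phi(x) - sum_j proj[j] * e_j) / r
            new_c = np.zeros(m + 1)
            new_c[m] = 1.0
            for p, c in zip(proj, coeffs):
                new_c -= p * c
            new_c /= r
            coeffs.append(new_c)
            basis_idx.append(t)
    return basis_idx, np.array(coeffs)

# Usage: with a linear kernel, [1,1] lies in the span of [1,0] and [0,1],
# so only the first two samples are retained as basis generators.
samples = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
idx, C = incremental_basis(samples, lambda a, b: float(np.dot(a, b)))
```

Once such a basis is available, the squared norm ||W||² of any weight vector expressed in it reduces to a sum of squared coordinates, which is what makes direct minimization (instead of convex programming) possible.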


Keywords: incremental learning, kernel selection, regression, support vector machine

Copyright information

© Springer 2005

Authors and Affiliations

  1. Institute of System Engineering, Faculty of Computer and Information Technology, Shanxi University, Taiyuan, People’s Republic of China