Support Vector Regression Based on Unconstrained Convex Quadratic Programming
Support vector regression (SVR) based on unconstrained convex quadratic programming is proposed, in which the Gaussian loss function is adopted. Compared with standard SVR, this method trains faster and generalizes directly to the complex-valued field. Experimental results confirm the feasibility and validity of the method.
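As an illustrative sketch (not the authors' implementation): with the Gaussian, i.e. squared, loss, kernel SVR reduces to an unconstrained convex QP whose minimizer solves the linear system (K + I/C)α = y, where K is the kernel matrix and C a regularization parameter. The snippet below fits the one-dimensional sinc function, a benchmark named in the keywords; the kernel width, C, and sample sizes are assumed values chosen for the demonstration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=2.0):
    # K[i, j] = exp(-gamma * (x_i - y_j)^2) for 1-D inputs
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
x_train = rng.uniform(-3.0, 3.0, 60)
# np.sinc is the normalized sinc, sin(pi x)/(pi x); add small noise
y_train = np.sinc(x_train) + 0.05 * rng.standard_normal(60)

C = 100.0  # regularization trade-off (assumed value)
K = rbf_kernel(x_train, x_train)
# Stationarity condition of the unconstrained convex QP:
# minimize (1/2) alpha^T K alpha + (C/2) ||y - K alpha||^2
# yields the linear system (K + I/C) alpha = y.
alpha = np.linalg.solve(K + np.eye(60) / C, y_train)

x_test = np.linspace(-3.0, 3.0, 200)
y_pred = rbf_kernel(x_test, x_train) @ alpha
max_err = float(np.max(np.abs(y_pred - np.sinc(x_test))))
```

Because the system matrix is symmetric positive definite, for large sample sizes the same system can also be solved matrix-free by the conjugate gradient method, which the keywords likewise mention.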
Keywords: Quadratic Programming · Support Vector Regression · Sinc Function · Generalized Inverse Matrix · Conjugate Gradient Descent