LTS-SVMR for Modeling of Nonlinear Systems with Noise and Outliers

  • Chen-Chia Chuang
  • Jin-Tsong Jeng
  • Guan-Yi Hu
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 273)


In general, many machine learning algorithms have been developed for intelligent control, and among them the support vector machine (SVM) has been widely used in recent years; many contributions on support vector machine regression (SVMR) and least squares support vector machine regression (LS-SVMR) can be found in well-known journals. In this paper, to address the robustness problem of LS-SVMR, we propose the least trimmed squares support vector machine regression (LTS-SVMR), a hybrid of LS-SVMR and the least trimmed squares (LTS) method. When the training samples contain noise and outliers, the LTS method can effectively remove large noise and outliers given a proper initial nonlinear function; combining LS-SVMR with LTS therefore enhances the robustness of LS-SVMR. Finally, the proposed LTS-SVMR is applied to the modeling of nonlinear systems with noise and outliers.
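The trim-and-refit idea described above can be sketched as follows. This is an illustrative reconstruction only, not the authors' exact algorithm: the RBF kernel, the hyperparameter values, and all function names are assumptions. An initial LS-SVMR fit plays the role of the "initial nonlinear function"; the LTS step then keeps only the h samples with the smallest absolute residuals and refits.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # Standard LS-SVMR dual: solve the (n+1)x(n+1) linear system
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_train, alpha, b, X, sigma=1.0):
    return rbf_kernel(X, X_train, sigma) @ alpha + b

def lts_svmr(X, y, trim_frac=0.2, n_iter=3, gamma=10.0, sigma=1.0):
    # LTS step: repeatedly fit, rank all samples by absolute residual,
    # and keep only the h samples that fit best (outliers get trimmed).
    n = len(y)
    h = int(np.ceil((1.0 - trim_frac) * n))
    idx = np.arange(n)
    for _ in range(n_iter):
        alpha, b = lssvm_fit(X[idx], y[idx], gamma, sigma)
        resid = np.abs(lssvm_predict(X[idx], alpha, b, X, sigma) - y)
        idx = np.argsort(resid)[:h]
    alpha, b = lssvm_fit(X[idx], y[idx], gamma, sigma)
    return X[idx], alpha, b

# Demo: sinc target with gross outliers injected every 10th sample.
rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 60)[:, None]
clean = np.sinc(X).ravel()
y = clean + 0.05 * rng.standard_normal(60)
y[::10] += 3.0                      # 6 gross outliers

Xs, alpha, b = lts_svmr(X, y)
yhat = lssvm_predict(Xs, alpha, b, X)
```

With 20% trimming, the six injected outliers fall outside the retained subset, so the final fit tracks the underlying sinc function rather than being pulled toward the contaminated points.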


Keywords: least squares support vector machine regression, least trimmed squares, modeling, noise and outliers




References

  1. Karatzoglou, A., Meyer, D.: Support Vector Machines in R. Journal of Statistical Software 15(9), 1–28 (2006)
  2. Chuang, C.C., Lai, M.H., Chen, S.S., Jeng, J.T.: Hybrid robust LS-SVMR with outliers for MIMO system. In: IEEE International Conference on Systems, Man and Cybernetics (SMC), pp. 10–13 (October 2010)
  3. Quan, T., Liu, X., Liu, Q.: Weighted Least Squares Support Vector Machine Local Region Method for Nonlinear Time Series Prediction. Applied Soft Computing 10(2), 562–566 (2010)
  4. Li, Z., Tang, X.: Using Support Vector Machines to Enhance the Performance of Bayesian Face Recognition. IEEE Transactions on Information Forensics and Security 2(2), 174–180 (2007)
  5. Camps-Valls, G., Soria-Olivas, E., Perez-Ruixo, J.J., Perez-Cruz, F., Artes-Rodriguez, A., Jimenez-Torres, N.V.: Therapeutic Drug Monitoring of Kidney Transplant Recipients Using Profiled Support Vector Machines. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 37(3), 359–372 (2007)
  6. Suykens, J.A.K., Gestel, T.V., Brabanter, J.D., Moor, B.D., Vandewalle, J.: Least Squares Support Vector Machines, ch. 2. World Scientific, New Jersey (2002)
  7. Valyon, J., Horvath, G.: A Weighted Generalized LS-SVM. Periodica Polytechnica Ser. El. Eng. 47(3), 229–251 (2003)
  8. Yu, Z., Cai, Y.: Least Squares Wavelet Support Vector Machines for Nonlinear System Identification. In: Wang, J., Liao, X.-F., Yi, Z. (eds.) ISNN 2005. LNCS, vol. 3497, pp. 436–441. Springer, Heidelberg (2005)
  9. Rousseeuw, P.J., Leroy, A.M.: Robust Regression and Outlier Detection. John Wiley & Sons, Inc., Hoboken (2003)
  10. Rousseeuw, P.J.: Multivariate Estimation With High Breakdown Point. In: Grossmann, W., Pflug, G., Vincze, I., Wertz, W. (eds.) Mathematical Statistics and Applications, vol. B, pp. 283–297. Reidel, Dordrecht (1985)
  11. Alpaydın, E.: Introduction to Machine Learning, ch. 1. The MIT Press (October 2004)
  12. Quan, T., Liu, X., Liu, Q.: Weighted Least Squares Support Vector Machine Local Region Method for Nonlinear Time Series Prediction. Applied Soft Computing 10(2), 562–566 (2010)
  13. Chuang, C.C., Su, S.F., Jeng, J.T., Hsiao, C.C.: Robust Support Vector Regression Networks for Function Approximation With Outliers. IEEE Transactions on Neural Networks 13(6), 1322–1330 (2002)
  14. Suykens, J.A.K., De Brabanter, J., Lukas, L., Vandewalle, J.: Weighted least squares support vector machines: robustness and sparse approximation. Neurocomputing 48(1–4), 85–105 (2002)
  15. Rousseeuw, P.J., Driessen, K.V.: Computing LTS Regression for Large Data Sets. Data Mining and Knowledge Discovery 21(1), 29–45 (2006)
  16. Leontitsis, A., Pange, J.: Statistical significance of the LMS regression. Mathematics and Computers in Simulation 64(5), 543–547 (2004)
  17. Fu, Y.Y., Wu, C.J., Jeng, J.T., Ko, C.N.: Identification of MIMO systems using radial basis function networks with hybrid learning algorithm. Applied Mathematics and Computation 213(1), 184–196 (2009)

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Chen-Chia Chuang (1)
  • Jin-Tsong Jeng (2)
  • Guan-Yi Hu (2)
  1. Department of Electrical Engineering, National Ilan University, Yilan, Taiwan, ROC
  2. Department of Computer Science and Information Engineering, National Formosa University, Huwei, Taiwan, ROC
