
A novel support vector regression algorithm incorporated with prior knowledge and error compensation for small datasets

  • Zhenyu Liu
  • Yunkun Xu
  • Chan Qiu
  • Jianrong Tan
Original Article

Abstract

To solve the modeling problem under conditions where the measured data are insufficient but biased prior knowledge from a simulator is available, we propose a novel multi-scale \(\nu \)-linear programming support vector regression (\(\nu \)-LPSVR) algorithm called \(\nu \)-MPESVR. The proposed algorithm constructs a nested support vector regression model that incorporates prior knowledge into \(\nu \)-LPSVR, compensates for the errors between the prior knowledge data and the measured data, and simultaneously achieves a small training error for both the prediction model and the error compensation model. Considering that measured data may exist in multiple feature spaces, we extend the algorithm to a multi-scale \(\nu \)-LPSVR to achieve accurate modeling for complex problems. In addition, a parameter-selection strategy for \(\nu \)-MPESVR is presented. The performance of the proposed algorithm is evaluated on a synthetic example and a practical application. All models are assessed with the root mean square error (RMSE), the mean absolute error (MAE), and the coefficient of determination (R2). In the three groups of experiments of the synthetic example, \(\nu \)-MPESVR performs best and maintains high accuracy when the bias of the prior knowledge data changes (RMSE values of 0.1962, 0.1904, and 0.2261; MAE values of 0.1396, 0.1375, and 0.1623; and R2 values of 0.9919, 0.9923, and 0.9892 for the three groups, respectively). The experimental results indicate that the proposed algorithm obtains a satisfactory model from a limited amount of measured data and outperforms existing algorithms.
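
To make the prior-knowledge-plus-error-compensation idea concrete, the following is a minimal sketch using scikit-learn's NuSVR. It is not the \(\nu \)-MPESVR formulation itself: the paper trains the prediction model and the error compensation model jointly within a single linear program and supports multi-scale kernels, whereas this illustrative version fits the two models in separate, decoupled stages. The functions, data, and hyperparameters below (true_process, biased_simulator, nu, C, gamma) are assumptions chosen only for illustration.

```python
# Two-stage illustration of combining biased simulator data (prior knowledge)
# with a few measured points via error compensation.
# NOTE: this is a simplified sketch, not the joint nu-MPESVR linear program
# from the paper, and it omits the multi-scale kernel extension.
import numpy as np
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)

# Hypothetical true process and a systematically biased simulator
# standing in for the prior knowledge source.
def true_process(x):
    return np.sin(3 * x) + 0.5 * x

def biased_simulator(x):
    return np.sin(3 * x) + 0.5 * x + 0.3 * np.cos(x)  # systematic bias

# Abundant simulator (prior knowledge) data, but only a few measured points.
x_prior = np.linspace(-2, 2, 200).reshape(-1, 1)
y_prior = biased_simulator(x_prior.ravel())

x_meas = rng.uniform(-2, 2, 15).reshape(-1, 1)
y_meas = true_process(x_meas.ravel()) + 0.05 * rng.standard_normal(15)

# Stage 1: capture prior knowledge from the simulator data.
prior_model = NuSVR(nu=0.5, C=10.0, kernel="rbf", gamma=1.0).fit(x_prior, y_prior)

# Stage 2: compensate the simulator bias using residuals on the measured data.
residuals = y_meas - prior_model.predict(x_meas)
error_model = NuSVR(nu=0.5, C=10.0, kernel="rbf", gamma=1.0).fit(x_meas, residuals)

# Combined prediction = prior-knowledge model + error compensation model.
x_test = np.linspace(-2, 2, 50).reshape(-1, 1)
y_pred = prior_model.predict(x_test) + error_model.predict(x_test)

rmse = np.sqrt(np.mean((y_pred - true_process(x_test.ravel())) ** 2))
print(f"RMSE of compensated model: {rmse:.4f}")
```

Because the two fits above are decoupled, the compensation model cannot influence how the prior-knowledge model is trained; the appeal of \(\nu \)-MPESVR as described in the abstract is precisely that both models are optimized simultaneously under one objective.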

Keywords

Support vector regression · Prior knowledge · Error compensation · Multi-scale · Small dataset

Acknowledgements

Support from the National Natural Science Foundation of China (Grant Nos. 51475418, 51490663, 51475417) is gratefully acknowledged.


Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2019

Authors and Affiliations

  1. State Key Laboratory of CAD&CG, Zhejiang University, Hangzhou, People’s Republic of China
