Implementation Method of SVR Algorithm in Resource-Constrained Platform

  • Bing Liu
  • Shoujuan Huang
  • Ruidong Wu
  • Ping Fu
Conference paper
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 157)

Abstract

With the development of the Internet of Things and edge computing, machine learning algorithms increasingly need to be deployed on resource-constrained embedded platforms. Support Vector Regression (SVR) is one of the most popular algorithms for small-sample, high-dimensional, and nonlinear problems, owing to its good generalization ability and prediction performance. However, the SVR algorithm consumes considerable resources when implemented directly. This paper therefore proposes a method for implementing the SVR algorithm on a resource-constrained embedded platform. The method analyses the characteristics of the data used by the SVR algorithm and of its solution process, and then optimizes the implementation according to the characteristics of the embedded platform. Experiments on UCI datasets show that the implemented SVR algorithm is correct and effective, and that the optimized version reduces both time and memory consumption, which is of great significance for deploying the SVR algorithm on resource-constrained embedded platforms.

Keywords

SVR algorithm · Resource-constrained · Embedded platform · Implementation method


Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. Harbin Institute of Technology, Harbin, China
