Demonstration of SVM Classification Based on Improved Gauss Kernel Function

  • Yang Han
  • Jie Li
  • Jin-Ze Li
  • Hong-Wei Xing
  • Ai-Min Yang
  • Yu-Hang Pan
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 613)

Abstract

This article analyses the Gaussian kernel function, among the many kernel functions available for support vector machines, explaining its scope of application and designing a discriminant algorithm to decide whether sample data follow a Gaussian distribution. The squared distance in Hilbert space is selected as the separability measure for the data. An evaluation function that maximizes the classification margin over the Gaussian kernel parameter interval is designed, and the corresponding kernel-parameter optimization procedure is worked out. The results show that the Gaussian kernel parameter optimization algorithm designed in this paper has high practical value: the optimized kernel parameter is λ = 0.6, and a comparative analysis of the support vector machine classification results for λ = 0.1, 0.2, …, 0.9 verifies the superiority of the parameter optimization algorithm.
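The parameter sweep described above can be sketched as follows. This is a minimal illustration, not the authors' code: the synthetic dataset is an assumption, and λ is mapped to scikit-learn's `gamma` via the common RBF parameterisation K(x, z) = exp(−‖x − z‖² / (2λ²)), i.e. gamma = 1 / (2λ²).

```python
# Hedged sketch: compare SVM accuracy across Gaussian (RBF) kernel
# parameters lambda = 0.1, 0.2, ..., 0.9, as in the paper's comparison.
# Dataset and the lambda -> gamma mapping are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic two-class data standing in for the paper's sample data.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

lambdas = [round(0.1 * k, 1) for k in range(1, 10)]
scores = {}
for lam in lambdas:
    # Treat lambda as the RBF kernel width, so gamma = 1 / (2 * lambda^2)
    # in scikit-learn's parameterisation of exp(-gamma * ||x - z||^2).
    clf = SVC(kernel="rbf", gamma=1.0 / (2.0 * lam ** 2))
    scores[lam] = cross_val_score(clf, X, y, cv=5).mean()

# The lambda with the highest cross-validated accuracy plays the role of
# the optimized kernel parameter in the paper's comparison.
best = max(scores, key=scores.get)
print(best, scores[best])
```

A margin-based evaluation function, as the paper proposes, would replace the cross-validated accuracy here; accuracy is used only because it makes the sweep self-contained.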

Keywords

Support vector machine · Gaussian kernel function · Hilbert space · Classification interval · Parameter optimization

Notes

Acknowledgments

This work was supported by National Key Research and Development Program (no. 2016YFB0601403), the National Natural Science Foundation of China (no. 51504080), and the National Natural Science Foundation of Hebei Education Department (no. QN2016088).


Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • Yang Han (1)
  • Jie Li (2)
  • Jin-Ze Li (3)
  • Hong-Wei Xing (2)
  • Ai-Min Yang (1)
  • Yu-Hang Pan (1)

  1. Hebei Key Laboratory of Data Science and Applications, North China University of Science and Technology, Tangshan, China
  2. The Ministry of Education Key Laboratory with Modern Metallurgical Technology, North China University of Science and Technology, Tangshan, China
  3. College of Grassland and Environment Sciences, Xinjiang Agricultural University, Xinjiang, China
