
An Adaptive Gaussian Kernel for Support Vector Machine

  • Research Article - Computer Engineering and Computer Science
  • Published in: Arabian Journal for Science and Engineering

Abstract

The Gaussian kernel, also known as the radial basis function (RBF), is the most commonly used kernel function of the support vector machine (SVM) for nonlinearly separable datasets in machine learning. The Gaussian kernel decays exponentially in the input feature space and uniformly in all directions around the support vector, producing hyper-spherical contours of the kernel function. In this study, an adaptive kernel function based on the Gaussian kernel is designed for use in SVM. Whereas the sigma parameter of the traditional Gaussian kernel is set to an arbitrary value, the proposed modified Gaussian kernel computes an adaptive value for it from the input vectors. The proposed kernel function is compared with the linear, polynomial and Gaussian kernels commonly used in support vector machines, as well as with state-of-the-art SVM kernels. The results show that the proposed kernel function performs well on both linearly and nonlinearly separable datasets compared to the other kernel functions.
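As a concrete illustration of the idea described above, the sketch below fits an SVM with a Gaussian kernel whose width parameter sigma is computed from the input vectors rather than set by hand. The paper's exact adaptive rule is not reproduced in this excerpt, so the median-pairwise-distance heuristic used here, as well as the helper names gaussian_kernel and adaptive_sigma, are illustrative assumptions rather than the authors' method.

```python
# Minimal sketch: a Gaussian (RBF) kernel whose sigma is derived from the data
# instead of being chosen arbitrarily. The sigma rule below (median pairwise
# distance of the training set) is an assumption for illustration only.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def gaussian_kernel(X, Y, sigma):
    """Standard Gaussian kernel: exp(-||x - y||^2 / (2 * sigma^2))."""
    # Squared Euclidean distances between every row of X and every row of Y.
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))


def adaptive_sigma(X):
    """Illustrative data-driven sigma: median pairwise distance (assumption)."""
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(X**2, axis=1)[None, :] - 2.0 * X @ X.T
    d = np.sqrt(np.maximum(d2, 0.0))
    return np.median(d[np.triu_indices_from(d, k=1)])


# Toy nonlinearly separable dataset.
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

sigma = adaptive_sigma(X_tr)  # sigma computed from the input vectors

# SVC accepts a callable kernel that returns the Gram matrix between two sets.
clf = SVC(kernel=lambda A, B: gaussian_kernel(A, B, sigma), C=1.0)
clf.fit(X_tr, y_tr)
print(f"sigma = {sigma:.3f}, test accuracy = {clf.score(X_te, y_te):.3f}")
```

Because sigma is derived from the training vectors, the same script transfers across datasets of different scales without retuning the kernel width by hand, which is the practical motivation behind an adaptive Gaussian kernel.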



Author information

Corresponding author

Correspondence to Cemil Közkurt.

About this article

Cite this article

Elen, A., Baş, S. & Közkurt, C. An Adaptive Gaussian Kernel for Support Vector Machine. Arab J Sci Eng 47, 10579–10588 (2022). https://doi.org/10.1007/s13369-022-06654-3

