
An efficient regularized K-nearest neighbor structural twin support vector machine


Abstract

The K-nearest neighbor based structural twin support vector machine (KNN-STSVM) performs better than the structural twin support vector machine (S-TSVM). It applies the intra-class KNN method to give different weights to the samples of one class, which strengthens the structural information; for the other class, redundant constraints are deleted by the inter-class KNN method to speed up the training process. However, KNN-STSVM implements only the empirical risk minimization principle, which easily leads to over-fitting and reduces the prediction accuracy of the classifier. To enhance the generalization ability of the classifier, we propose an efficient regularized K-nearest neighbor structural twin support vector machine, called RKNN-STSVM, by introducing a regularization term into the objective function. The objective function therefore contains two parts: one maximizes the margin between the two parallel hyper-planes, and the other minimizes the training errors of the two classes of samples, so the structural risk minimization principle is implemented in RKNN-STSVM. In addition, a fast dual coordinate descent method (DCDM) is introduced to handle relatively large-scale problems more efficiently. Comprehensive experimental results on twenty-seven benchmark datasets and two popular image datasets demonstrate the efficiency of the proposed RKNN-STSVM.
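To make the two parts of the objective concrete, here is a minimal TBSVM-style sketch of the regularized problem for the first hyper-plane, written under the usual twin-SVM notation: A and B collect the samples of the two classes, e_1 and e_2 are vectors of ones, \xi is the slack vector, and c_1, c_3 are trade-off parameters. It only illustrates how the regularization term (c_3/2)(\|w_1\|^2 + b_1^2) brings in structural risk minimization; the full RKNN-STSVM formulation additionally carries the KNN-based weights and the structural information, which are omitted here.

    \min_{w_1, b_1, \xi} \; \frac{c_3}{2}\left(\|w_1\|^2 + b_1^2\right) + \frac{1}{2}\|A w_1 + e_1 b_1\|^2 + c_1 e_2^{\top}\xi
    \text{s.t.} \; -(B w_1 + e_2 b_1) + \xi \ge e_2, \quad \xi \ge 0

The dual of such a problem is a box-constrained quadratic program, \min_{\alpha} \frac{1}{2}\alpha^{\top} Q \alpha - e^{\top}\alpha with 0 \le \alpha \le c_1 e, which is exactly the form that a dual coordinate descent method optimizes one variable at a time. The Python sketch below shows generic dual coordinate descent for this kind of QP; the matrix Q, the stopping rule and any shrinking heuristics of the paper's DCDM are assumptions here, not the authors' implementation.

    import numpy as np

    def dual_coordinate_descent(Q, C, tol=1e-4, max_iter=1000):
        # Hypothetical generic sketch: solves min_a 0.5*a'Qa - e'a, 0 <= a_i <= C,
        # by updating one dual variable at a time (Hsieh et al.-style coordinate descent).
        n = Q.shape[0]
        alpha = np.zeros(n)
        Qa = np.zeros(n)                      # running value of Q @ alpha
        for _ in range(max_iter):
            max_violation = 0.0
            for i in np.random.permutation(n):
                g = Qa[i] - 1.0               # partial derivative of the dual w.r.t. alpha_i
                pg = g                        # projected gradient at the box boundary
                if alpha[i] == 0.0:
                    pg = min(g, 0.0)
                elif alpha[i] == C:
                    pg = max(g, 0.0)
                max_violation = max(max_violation, abs(pg))
                if abs(pg) > 1e-12 and Q[i, i] > 0:
                    old = alpha[i]
                    alpha[i] = min(max(alpha[i] - g / Q[i, i], 0.0), C)
                    Qa += (alpha[i] - old) * Q[:, i]   # keep Q @ alpha up to date
            if max_violation < tol:           # near-optimal: all projected gradients small
                break
        return alpha

Maintaining Q @ alpha incrementally keeps each coordinate update linear in the number of dual variables instead of requiring a full gradient recomputation, which is what makes this kind of solver attractive for the relatively large-scale problems mentioned above.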


Notes

  1. UCI Machine Learning Repository: http://archive.ics.uci.edu/ml/datasets.html


Acknowledgments

The authors gratefully acknowledge the helpful comments of the reviewers, which have improved the presentation. This work was supported in part by the National Natural Science Foundation of China (No. 11671010) and the Beijing Natural Science Foundation (No. 4172035).

Author information

Corresponding author

Correspondence to Yitian Xu.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Xie, F., Xu, Y. An efficient regularized K-nearest neighbor structural twin support vector machine. Appl Intell 49, 4258–4275 (2019). https://doi.org/10.1007/s10489-019-01505-5

