Applying a novel decision rule to the sphere-structured support vector machines algorithm
The traditional sphere-structured support vector machines algorithm partitions the training-sample space by constructing, for each pattern class, the minimum-volume sphere that covers all training samples of that class in a high-dimensional feature space. However, its decision rule cannot assign valid class labels to ambiguous sample points, such as points enclosed by two or more spheres, so the traditional algorithm falls short of the best attainable classification performance. In this article, we propose a novel decision rule for the traditional sphere-structured support vector machines that significantly improves the labeling of ambiguous points. Experimental results on seven real datasets show that the sphere-structured support vector machines equipped with the new decision rule not only achieves better classification accuracy than the traditional version but also performs comparably to the classical support vector machines.
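The idea sketched in the abstract can be illustrated with a toy classifier. The sketch below is an assumption-laden simplification: each class sphere is approximated by the class centroid and the largest centroid-to-sample distance, rather than by the QP-based minimum enclosing ball used in sphere-structured SVMs, and the tie-breaking rule (smallest relative distance to a sphere center) is one plausible choice for handling ambiguous points, not necessarily the rule proposed in the article.

```python
import numpy as np

class SphereClassifier:
    """Toy sphere-per-class classifier (illustrative sketch only).

    Each class is summarized by a covering sphere; here the sphere is
    approximated by the class centroid and the maximum centroid-to-sample
    distance, not by the minimum-enclosing-ball QP of the paper.
    """

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centers_ = {}
        self.radii_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            center = Xc.mean(axis=0)
            self.centers_[c] = center
            # Radius = farthest class sample from the centroid.
            self.radii_[c] = np.linalg.norm(Xc - center, axis=1).max()
        return self

    def predict(self, X):
        labels = []
        for x in X:
            # Relative distance d(x, center) / radius for every class sphere.
            scores = {c: np.linalg.norm(x - self.centers_[c]) / self.radii_[c]
                      for c in self.classes_}
            inside = [c for c, s in scores.items() if s <= 1.0]
            if len(inside) == 1:
                # Unambiguous: exactly one sphere covers the point.
                labels.append(inside[0])
            else:
                # Ambiguous (covered by several spheres, or by none):
                # assign the class whose sphere the point is relatively
                # deepest inside (smallest normalized distance).
                labels.append(min(scores, key=scores.get))
        return np.array(labels)
```

On well-separated data the rule reduces to membership in a single sphere; the normalized-distance fallback only matters in overlap regions, which is precisely where the traditional decision rule fails to produce a valid label.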
Keywords: Pattern classification · Sphere-structured support vector machines · Decision rule · Kernel functions
We thank the anonymous reviewers for their constructive comments, which improved the presentation of this article.