Soft Computing

Volume 11, Issue 4, pp 383–389

Nesting Algorithm for Multi-Classification Problems

  • Bo Liu
  • Zhifeng Hao
  • Xiaowei Yang


Support vector machines (SVMs) were originally designed for binary classification, so multi-class problems are usually decomposed into a set of binary ones. Among the conventional multi-class algorithms, the One-against-One algorithm is a very powerful method; however, it leaves a middle unclassifiable region. To overcome this drawback, a novel method called the Nesting Algorithm is presented in this paper. The idea is as follows: first, construct the optimal hyperplanes based on the One-against-One approach. Second, if data points remain in the middle unclassifiable region, select them to construct optimal hyperplanes with the same hyperparameters. Third, repeat the second step until no data points are left in the unclassifiable region or the region disappears. In this paper, we also prove the validity of the proposed algorithm for the unclassifiable region and analyze its computational complexity. To examine the training accuracy and the generalization performance of the proposed algorithm, the One-against-One algorithm, the fuzzy least squares support vector machine (FLS-SVM) and the proposed algorithm are applied to five UCI datasets. The results show that the training accuracy of the proposed algorithm is higher than that of the others, while its generalization performance is comparable with them.
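The iterative scheme described above can be sketched roughly as follows. This is a minimal illustration assuming scikit-learn's SVC for the pairwise classifiers; the function names (`fit_pairwise`, `vote`, `nesting_fit`) and the exact stopping rule are illustrative assumptions, not the authors' implementation:

```python
from itertools import combinations

import numpy as np
from sklearn.svm import SVC


def fit_pairwise(X, y, classes, **svm_params):
    """Train one binary SVM per pair of classes (One-against-One)."""
    models = {}
    for a, b in combinations(classes, 2):
        mask = np.isin(y, [a, b])
        if len(np.unique(y[mask])) < 2:
            continue  # this pair is not represented in the (nested) subset
        models[(a, b)] = SVC(**svm_params).fit(X[mask], y[mask])
    return models


def vote(models, X, classes):
    """Majority vote over the pairwise classifiers.

    Returns the winning label per point and a boolean mask marking points in
    the unclassifiable region (two or more classes tied for the top vote).
    """
    votes = np.zeros((len(X), len(classes)), dtype=int)
    col = {c: i for i, c in enumerate(classes)}
    for clf in models.values():
        for i, p in enumerate(clf.predict(X)):
            votes[i, col[p]] += 1
    top = votes.max(axis=1)
    ties = (votes == top[:, None]).sum(axis=1) > 1
    labels = np.asarray(classes)[votes.argmax(axis=1)]
    return labels, ties


def nesting_fit(X, y, max_levels=5, **svm_params):
    """Level 0 is plain One-against-One; each further level is trained only
    on the points the previous level left in the unclassifiable region,
    reusing the same hyperparameters throughout."""
    classes = sorted(set(y))
    levels = []
    Xc, yc = X, y
    for _ in range(max_levels):
        models = fit_pairwise(Xc, yc, classes, **svm_params)
        levels.append(models)
        _, ties = vote(models, Xc, classes)
        # Stop when the region is empty or holds points of at most one class.
        if not ties.any() or len(np.unique(yc[ties])) < 2:
            break
        Xc, yc = Xc[ties], yc[ties]  # nest: keep only the unresolved points
    return levels, classes
```

At prediction time, a point would be routed through the levels in order, falling through to the next level only while it stays in the tie region of the current one.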


Support vector machines · Least squares support vector machine · One-against-One algorithm · FLS-SVM · Nesting algorithm




References

  1. Abe S (2003) Analysis of multiclass support vector machines. In: Proceedings of the international conference on computational intelligence for modelling, control and automation, Vienna, Austria, pp 385–396
  2. Angulo C, Parra X, Català A (2003) K-SVCR: a support vector machine for multi-class classification. Neurocomputing 55:57–77
  3. Boser BE, Guyon IM, Vapnik VN (1992) A training algorithm for optimal margin classifiers. In: Haussler D (ed) Proceedings of the 5th annual ACM workshop on computational learning theory, pp 144–152
  4. Bredensteiner EJ, Bennett KP (1999) Multicategory classification by support vector machines. Comput Optim Appl 12:53–79
  5. Fürnkranz J (2002) Round robin classification. J Mach Learn Res 2:721–747
  6. Hao ZF et al (2005) Twi-map support vector machine for multi-classification problems. In: Proceedings of the ISNN 2005 conference, Lecture notes in computer science 3496, Chongqing, PT1, pp 869–874
  7. Hsu CW, Lin CJ (2002) A comparison of methods for multiclass support vector machines. IEEE Trans Neural Netw 13:415–425
  8. Inoue T, Abe S (2001) Fuzzy support vector machines for pattern classification. In: Proceedings of the international joint conference on neural networks (IJCNN'01) 2:1449–1454
  9. Kreßel UHG (1999) Pairwise classification and support vector machines. In: Schölkopf B, Burges CJC, Smola AJ (eds) Advances in kernel methods: support vector learning. The MIT Press, Cambridge, pp 255–268
  10. Liu B, Hao ZF, Yang XW (2006) Binary tree support vector machine based on kernel Fisher discriminant for multi-classification. In: Proceedings of the ISNN 2006 conference, Lecture notes in computer science (in press)
  11. Murphy PM, Aha DW (1992) UCI repository of machine learning databases
  12. Platt JC, Cristianini N, Shawe-Taylor J (2000) Large margin DAGs for multiclass classification. In: Solla SA, Leen TK, Müller KR (eds) Advances in neural information processing systems 12. The MIT Press, Cambridge, pp 547–553
  13. Rifkin R, Klautau A (2004) In defense of one-vs-all classification. J Mach Learn Res 5:101–141
  14. Saitoh S (1988) Theory of reproducing kernels and its applications. Longman, Harlow
  15. Suykens JAK (2000) Least squares support vector machine for classification and nonlinear modeling. Neural Netw World 10(1–2):29–47
  16. Tsujinishi D, Abe S (2003) Fuzzy least squares support vector machines for multiclass problems. Neural Netw 16:785–792
  17. Vapnik VN (1995) The nature of statistical learning theory. Springer, London
  18. Vapnik VN (1998) Statistical learning theory. Wiley, New York
  19. Vapnik VN (1999) An overview of statistical learning theory. IEEE Trans Neural Netw 10(5):988–999

Copyright information

© Springer-Verlag 2006

Authors and Affiliations

  1. College of Computer Science and Engineering, South China University of Technology, Guangzhou, People’s Republic of China
  2. School of Mathematical Science, South China University of Technology, Guangzhou, People’s Republic of China
