
Variants and Performances of Novel Direct Learning Algorithms for L2 Support Vector Machines

  • Conference paper
Artificial Intelligence and Soft Computing (ICAISC 2014)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 8468)

Included in the conference series: International Conference on Artificial Intelligence and Soft Computing (ICAISC)

Abstract

The paper introduces a novel Direct L2 Support Vector Machine (DL2 SVM) classifier and presents the performance of its four variants on 12 different binary and multiclass datasets. Direct L2 SVM avoids solving a quadratic programming (QP) problem and instead solves a Nonnegative Least Squares (NNLS) task which, unlike the related iterative algorithms, produces impeccably accurate results. The solutions obtained by NNLS and QP are equal, but NNLS needs much less CPU time. The comprehensive DL2 SVM model, as well as its three variants, is devised. Similarities to, and differences from, LS SVM and proximal SVMs are also pointed out. The performances of the four DL2 SVM models are compared in terms of accuracy, percentage of support vectors, and CPU time. A strict nested cross-validation (double resampling) is used in all experiments.
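
To make the idea concrete, the sketch below shows how a QP-free, NNLS-based training step could look in practice. It is an illustrative approximation only: the exact linear system that DL2 SVM builds is defined in the paper itself, so the system matrix used here (an RBF kernel with the bias folded in and an L2-slack term I/C on the diagonal) and the helper names rbf_kernel, train_nnls_classifier, and predict are assumptions made for this example, not the authors' implementation.

import numpy as np
from scipy.optimize import nnls

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix between two sample sets
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_nnls_classifier(X, y, C=1.0, gamma=1.0):
    # Solve min ||H a - 1||^2 subject to a >= 0 with NNLS instead of a QP.
    # H is an assumed L2-SVM-style system matrix: label-signed kernel with the
    # bias absorbed (the "+ 1.0" term) and slack handled by I/C on the diagonal.
    n = X.shape[0]
    Y = np.diag(y.astype(float))
    H = Y @ (rbf_kernel(X, X, gamma) + 1.0) @ Y + np.eye(n) / C
    alpha, _ = nnls(H, np.ones(n))
    return alpha

def predict(alpha, X, y, X_test, gamma=1.0):
    # Decision function; the bias is recovered from the absorbed "+ 1" kernel term
    K = rbf_kernel(X_test, X, gamma)
    return np.sign(K @ (alpha * y) + (alpha * y).sum())

In such a sketch, samples with alpha_i > 0 play the role of support vectors, which is how a percentage of support vectors can be reported; solving the corresponding QP over the same system would, as the abstract states for the authors' formulation, give the same solution at a higher computational cost.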

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Zigic, L., Kecman, V. (2014). Variants and Performances of Novel Direct Learning Algorithms for L2 Support Vector Machines. In: Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds) Artificial Intelligence and Soft Computing. ICAISC 2014. Lecture Notes in Computer Science (LNAI), vol 8468. Springer, Cham. https://doi.org/10.1007/978-3-319-07176-3_8

  • DOI: https://doi.org/10.1007/978-3-319-07176-3_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-07175-6

  • Online ISBN: 978-3-319-07176-3

  • eBook Packages: Computer Science (R0)
