Improved Multi-class Support Vector Machines Using Novel Methods of Model Selection and Feature Extraction

  • Conference paper
Neural Information Processing (ICONIP 2013)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8227)

Abstract

In this paper, to improve the generalization capability of multi-class SVMs, we propose (1) a novel model selection method and (2) feature extraction by SVMs. In (1), unlike conventional model selection for multi-class SVMs, we determine the hyper-parameters, namely the kernel parameter and the margin parameter, separately for each separating hyper-plane: for each hyper-plane, we estimate the generalization capability and select the optimal hyper-parameter values individually. In (2), we use the weight vectors of the decision functions obtained by training the multi-class SVMs as the basis vectors of a subspace, and we determine the separating hyper-planes in that subspace. The new separating hyper-planes are therefore determined while taking all of the original separating hyper-planes into account. Using multi-class benchmark data sets, we evaluate the effectiveness of the proposed methods against the conventional method.
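
The two ideas in the abstract can be illustrated with a minimal sketch, shown below. This is not the authors' exact algorithm: it assumes a one-vs-rest decomposition, uses scikit-learn's SVC as a stand-in, and the dataset (Iris), parameter grid, and cross-validation settings are placeholder choices. Step (1) selects the kernel parameter gamma and the margin parameter C separately for each binary separating hyper-plane; step (2) represents every sample by the decision-function outputs of those binary SVMs and trains the final multi-class SVM in that subspace.

# A minimal sketch of the two proposals, under the assumptions stated above.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

# Placeholder data and hyper-parameter grid (not taken from the paper).
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]}

# (1) Per-hyper-plane model selection: one binary SVM per class (one-vs-rest),
# each with its own cross-validated (C, gamma) instead of a single shared pair.
binary_svms = []
for cls in np.unique(y_tr):
    y_bin = (y_tr == cls).astype(int)
    search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
    search.fit(X_tr, y_bin)
    binary_svms.append(search.best_estimator_)

# (2) Feature extraction: map each sample to the vector of decision values of
# all binary SVMs, then train the final multi-class SVM in this subspace.
def to_subspace(X):
    return np.column_stack([svm.decision_function(X) for svm in binary_svms])

final_svm = SVC(kernel="rbf", C=10, gamma="scale")
final_svm.fit(to_subspace(X_tr), y_tr)
print("test accuracy:", final_svm.score(to_subspace(X_te), y_te))

In this sketch the subspace dimension equals the number of classes and the decision values are used directly as coordinates; the paper's weighting of the decision functions and its estimate of the generalization capability may differ in detail.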

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kitamura, T., Ota, K. (2013). Improved Multi-class Support Vector Machines Using Novel Methods of Model Selection and Feature Extraction. In: Lee, M., Hirose, A., Hou, ZG., Kil, R.M. (eds) Neural Information Processing. ICONIP 2013. Lecture Notes in Computer Science, vol 8227. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-42042-9_38

  • DOI: https://doi.org/10.1007/978-3-642-42042-9_38

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-42041-2

  • Online ISBN: 978-3-642-42042-9

  • eBook Packages: Computer Science, Computer Science (R0)
