Abstract
In this paper, to improve the generalization capability of multi-class SVMs, we propose (1) a novel model selection method and (2) feature extraction by SVMs. In (1), unlike conventional model selection for multi-class SVMs, we determine the hyper-parameters, namely the kernel and margin parameters, separately for each separating hyperplane: for each hyperplane, we estimate the generalization capability and select the optimal hyper-parameter values individually. In (2), we take the weight vectors of the decision functions obtained by training a multi-class SVM as the basis vectors of a subspace, and we determine the separating hyperplanes in that subspace. Thus, each new separating hyperplane is determined while taking all of the original separating hyperplanes into account. Using multi-class benchmark data sets, we evaluate the effectiveness of the proposed methods against the conventional method.
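Since the full text is not shown here, the following is only a minimal sketch of the two ideas named in the abstract, not the authors' implementation. It assumes a one-vs-rest decomposition, tunes the kernel parameter (gamma) and margin parameter (C) independently per separating hyperplane via cross-validation, and, as a simplification, uses linear SVM weight vectors as the subspace basis (the paper's method may instead work with kernel-space decision functions).

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC, LinearSVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
classes = np.unique(y_tr)

# (1) Per-hyperplane model selection: tune (C, gamma) for each
# one-vs-rest separating hyperplane separately, instead of sharing
# a single setting across all hyperplanes.
grid = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]}
per_class_svms = []
for cls in classes:
    search = GridSearchCV(SVC(kernel="rbf"), grid, cv=5)
    search.fit(X_tr, (y_tr == cls).astype(int))  # one-vs-rest labels
    per_class_svms.append(search.best_estimator_)

# Classify by the hyperplane with the largest decision value.
scores = np.column_stack([m.decision_function(X_te) for m in per_class_svms])
pred = classes[scores.argmax(axis=1)]
acc_per_plane = (pred == y_te).mean()
print("per-hyperplane accuracy:", acc_per_plane)

# (2) Subspace feature extraction: use the weight vectors of the
# trained one-vs-rest SVMs as basis vectors, project the data onto
# that subspace, and train the final separating hyperplanes there.
lin = [LinearSVC(C=1.0, dual=False).fit(X_tr, (y_tr == c).astype(int))
       for c in classes]
W = np.vstack([m.coef_ for m in lin])    # one weight vector per class
Z_tr, Z_te = X_tr @ W.T, X_te @ W.T      # projected (subspace) features
final = SVC(kernel="rbf", C=10, gamma=0.1).fit(Z_tr, y_tr)
acc_subspace = final.score(Z_te, y_te)
print("subspace accuracy:", acc_subspace)
```

Because every training example is scored against all class hyperplanes, the projected features retain information from all separating hyperplanes, which is the intuition behind determining the new hyperplanes in the subspace.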
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Kitamura, T., Ota, K. (2013). Improved Multi-class Support Vector Machines Using Novel Methods of Model Selection and Feature Extraction. In: Lee, M., Hirose, A., Hou, ZG., Kil, R.M. (eds) Neural Information Processing. ICONIP 2013. Lecture Notes in Computer Science, vol 8227. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-42042-9_38
Print ISBN: 978-3-642-42041-2
Online ISBN: 978-3-642-42042-9