Abstract
Though the support vector machine (SVM) has been a promising tool in machine learning, it does not directly provide a measure of feature importance. Identifying the subset of features that contribute most to classification is itself an important task. The benefit of feature selection is twofold: it leads to parsimonious models, which are often preferred in many scientific problems, and it is crucial for achieving good classification accuracy in the presence of redundant features. The SVM can be combined with various feature selection strategies. Some of them are "filters": general feature selection methods independent of the SVM. Others are wrapper-type methods: modifications of the SVM that choose important features as well as conduct training and testing. The machine learning literature contains several proposals for automatic feature selection in the SVM, some of which apply the $\ell_0$-norm or $\ell_1$-norm SVM and achieve competitive performance. In this chapter we propose two models, $\ell_p$-norm C-support vector classification ($\ell_p$-SVC) and the $\ell_p$-norm proximal support vector machine ($\ell_p$-PSVM), which combine C-SVC and PSVM, respectively, with a feature selection strategy by introducing the $\ell_p$-norm ($0 < p < 1$).
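The chapter's full formulation sits behind the preview, but the mechanism the abstract describes can be sketched: replace the usual $\ell_2$ or $\ell_1$ penalty on the weight vector of a linear SVM with $\sum_j |w_j|^p$ for $0 < p < 1$, which drives many coordinates of $w$ exactly to zero, and read feature importance off the surviving weights. The following Python sketch is an illustrative assumption, not the authors' $\ell_p$-SVC algorithm; the smoothing constant `eps`, the plain gradient solver, and every hyperparameter (`C`, `p`, `lr`, `n_iter`, `tau`) are hypothetical choices.

```python
# Minimal sketch (not the authors' formulation): an l_p-regularized linear SVM
# trained by smoothed (sub)gradient descent. All names and hyperparameters
# here are illustrative assumptions.
import numpy as np

def lp_svc_fit(X, y, C=1.0, p=0.5, eps=1e-6, lr=1e-3, n_iter=2000):
    """Approximately minimize
        sum_j (|w_j| + eps)^p  +  C * sum_i max(0, 1 - y_i (w.x_i + b)),
    where eps smooths the non-Lipschitz l_p penalty (0 < p < 1) near zero.
    y is expected to take values in {-1, +1}.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_iter):
        margin = y * (X @ w + b)
        viol = margin < 1.0                       # hinge-active samples
        # subgradient of the hinge-loss term
        g_w = -C * (y[viol][:, None] * X[viol]).sum(axis=0)
        g_b = -C * y[viol].sum()
        # gradient of the smoothed l_p penalty (zero at w_j = 0 since sign(0) = 0)
        g_w += p * np.sign(w) * (np.abs(w) + eps) ** (p - 1.0)
        w -= lr * g_w
        b -= lr * g_b
    return w, b

def selected_features(w, tau=1e-3):
    """Keep the features whose weights survive the sparsity-inducing penalty."""
    return np.flatnonzero(np.abs(w) > tau)

# Example usage on toy data with 3 informative features out of 20:
# rng = np.random.default_rng(0)
# X = rng.normal(size=(100, 20))
# w_true = np.zeros(20); w_true[:3] = 2.0
# y = np.sign(X @ w_true + 0.1 * rng.normal(size=100))
# w, b = lp_svc_fit(X, y)
# print(selected_features(w))
```

The $(|w_j| + \varepsilon)^p$ smoothing is a standard device for handling the non-Lipschitz $\ell_p$ penalty near zero; a reweighted-$\ell_1$ or smoothing-gradient solver would be the more rigorous choice. Even so, the sketch shows why pushing $p$ below 1 yields sparser weight vectors than the $\ell_1$-norm SVM, and hence an embedded, wrapper-type feature selector.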
About this chapter
Cite this chapter
Shi, Y., Tian, Y., Kou, G., Peng, Y., Li, J. (2011). Feature Selection via $\ell_p$-Norm Support Vector Machines. In: Optimization Based Data Mining: Theory and Applications. Advanced Information and Knowledge Processing. Springer, London. https://doi.org/10.1007/978-0-85729-504-0_6
Publisher Name: Springer, London
Print ISBN: 978-0-85729-503-3
Online ISBN: 978-0-85729-504-0
eBook Packages: Computer Science, Computer Science (R0)