
Linear Penalization Support Vector Machines for Feature Selection

  • Jaime Miranda
  • Ricardo Montoya
  • Richard Weber
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3776)

Abstract

Support Vector Machines have proved to be powerful tools for classification tasks, combining the minimization of classification errors with the maximization of generalization capability. Feature selection, however, is not considered explicitly in the basic model formulation. We propose a linearly penalized Support Vector Machine (LP-SVM) model in which feature selection is performed simultaneously with model construction. Its application to a customer retention problem and a comparison with other feature selection techniques demonstrate its effectiveness.
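The abstract does not spell out the penalization itself. A common way to realize such a linear penalty, following the line of Bradley and Mangasarian, is to add an ℓ1 term on the weight vector of a linear SVM, which drives many weights to exactly zero and thereby performs embedded feature selection. The sketch below is only an illustrative analogue of that idea using scikit-learn's LinearSVC; the synthetic data, the parameter values, and the sparsity threshold are assumptions for the example and are not taken from the paper.

```python
# Illustrative sketch: an l1-penalized linear SVM as an embedded
# feature selector. Not the authors' LP-SVM implementation; data and
# hyperparameters are made up for demonstration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Synthetic data: 20 features, only 5 of which carry class information.
X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           n_redundant=0, random_state=0)

# l1 penalty on the weights; C trades classification error against the
# number of retained features (smaller C -> sparser weight vector).
clf = LinearSVC(penalty='l1', loss='squared_hinge', dual=False,
                C=0.1, max_iter=10000)
clf.fit(X, y)

# Features whose weight survives the penalty are the selected ones.
selected = np.flatnonzero(np.abs(clf.coef_.ravel()) > 1e-8)
print(f"Selected {selected.size} of {X.shape[1]} features:", selected)
```

In this kind of formulation the penalty weight plays the role of the feature-selection knob: sweeping it traces a trade-off between classification accuracy and the number of retained variables.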

Keywords

Support Vector Machine · Feature Selection · Current Account · Support Vector Regression · Feature Selection Technique


Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Jaime Miranda ¹
  • Ricardo Montoya ¹
  • Richard Weber ¹

  1. Department of Industrial Engineering, Faculty of Physical and Mathematical Sciences, University of Chile
