
The Minimum Redundancy – Maximum Relevance Approach to Building Sparse Support Vector Machines

  • Conference paper
Intelligent Data Engineering and Automated Learning - IDEAL 2009 (IDEAL 2009)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5788)

Abstract

Building sparse SVMs has recently become an active research topic because of its potential applications in large-scale data mining tasks. One of the most popular approaches to building sparse SVMs is to select a small subset of the training samples and employ them as the support vectors. In this paper, we show that selecting the support vectors is equivalent to selecting columns of the kernel matrix, which in turn is equivalent to selecting a subset of features in the feature selection domain. We therefore propose to apply an effective feature selection algorithm, the Minimum Redundancy – Maximum Relevance (MRMR) algorithm, to the support vector selection problem. The MRMR algorithm was compared with two existing methods, the back-fitting (BF) and pre-fitting (PF) algorithms. Preliminary results show that, in terms of generalization performance, MRMR generally outperforms BF but is inferior to PF. However, the MRMR approach is extremely efficient and significantly faster than both compared algorithms.
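As an illustration of the abstract's central equivalence, the following is a minimal Python sketch of greedy MRMR-style selection over kernel-matrix columns, followed by a reduced-kernel least-squares fit standing in for SVM training. The correlation-based relevance and redundancy scores, the RBF kernel, and all function names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def mrmr_select_columns(K, y, n_select):
    """Greedily pick kernel-matrix columns by relevance minus redundancy.

    Relevance of a column: |correlation| with the labels.
    Redundancy of a column: mean |correlation| with columns already chosen.
    (These scores are one common MRMR instantiation, assumed here.)
    """
    n = K.shape[1]
    rel = np.abs(np.array([np.corrcoef(K[:, j], y)[0, 1] for j in range(n)]))
    C = np.abs(np.corrcoef(K, rowvar=False))  # column-column correlations
    selected = [int(np.argmax(rel))]
    while len(selected) < n_select:
        candidates = [j for j in range(n) if j not in selected]
        redundancy = C[np.ix_(candidates, selected)].mean(axis=1)
        scores = rel[candidates] - redundancy
        selected.append(candidates[int(np.argmax(scores))])
    return selected

# Toy usage: the selected columns index the training samples kept as
# support vectors; a least-squares fit on the reduced kernel stands in
# for the SVM training step the paper delegates to a standard solver.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X[:, 0] + 0.5 * rng.normal(size=200))
K = rbf_kernel(X, X, gamma=0.5)
sv_idx = mrmr_select_columns(K, y, n_select=10)
w, *_ = np.linalg.lstsq(K[:, sv_idx], y, rcond=None)
print("training accuracy:", (np.sign(K[:, sv_idx] @ w) == y).mean())
```

In this framing, each retained column indexes a training sample that serves as a support vector, and the fitted weights play the role the dual coefficients play in a standard SVM expansion.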






Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yang, X., Tang, K., Yao, X. (2009). The Minimum Redundancy – Maximum Relevance Approach to Building Sparse Support Vector Machines. In: Corchado, E., Yin, H. (eds) Intelligent Data Engineering and Automated Learning - IDEAL 2009. IDEAL 2009. Lecture Notes in Computer Science, vol 5788. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04394-9_23

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-04394-9_23

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-04393-2

  • Online ISBN: 978-3-642-04394-9

  • eBook Packages: Computer Science (R0)
