
Remainder Subset Awareness for Feature Subset Selection

  • Conference paper
  • In: Research and Development in Intelligent Systems XXVI

Abstract

Feature subset selection has become an increasingly common topic of research, a popularity partly due to the growth in the number of features and of application domains. It is therefore of the greatest importance to make the most of every evaluation of the inducer, which is normally the most costly part of the process. In this paper, a technique is proposed that takes into account the inducer's evaluation both on the current subset and on the remainder subset (its complement), and that is applicable to any sequential subset selection algorithm at a reasonable overhead in cost. Its feasibility is demonstrated on a series of benchmark data sets.
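The abstract only outlines the idea, so the following is a minimal illustrative sketch, not the paper's exact criterion. It assumes a plain sequential forward selection loop and an `evaluate(subset)` function returning the inducer's estimated performance on a feature subset (e.g. cross-validated accuracy); the function name, the `remainder_aware_sfs` wrapper, and the simple "candidate score minus remainder score" combination rule are all hypothetical choices made for illustration.

```python
def remainder_aware_sfs(features, evaluate, max_size=None):
    """Sequential forward selection that also scores the remainder subset.

    `evaluate(subset)` is assumed to return the inducer's estimated quality
    (e.g. cross-validated accuracy) on that subset of features. The way the
    two scores are combined below is an illustrative assumption, not the
    paper's published criterion.
    """
    features = set(features)
    selected = set()
    if max_size is None:
        max_size = len(features)

    while len(selected) < max_size:
        best_feature, best_score = None, float("-inf")
        for f in features - selected:
            candidate = selected | {f}
            remainder = features - candidate  # the complement of the candidate
            # Favour candidates that score well on their own while their
            # remainder scores poorly, i.e. the candidate appears to have
            # captured the useful information.
            score = evaluate(candidate)
            if remainder:
                score -= evaluate(remainder)
            if score > best_score:
                best_feature, best_score = f, score
        selected.add(best_feature)
    return selected
```

With `evaluate` implemented as an inducer evaluation, each candidate now costs roughly two evaluations (current subset and remainder) instead of one, which is in line with the "reasonable overhead in cost" the abstract mentions.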



Author information

Correspondence to Gabriel Prat-Masramon.



Copyright information

© 2010 Springer-Verlag London

About this paper

Cite this paper

Prat-Masramon, G., Belanche-Muñoz, L.A. (2010). Remainder Subset Awareness for Feature Subset Selection. In: Bramer, M., Ellis, R., Petridis, M. (eds) Research and Development in Intelligent Systems XXVI. Springer, London. https://doi.org/10.1007/978-1-84882-983-1_25


  • DOI: https://doi.org/10.1007/978-1-84882-983-1_25


  • Publisher Name: Springer, London

  • Print ISBN: 978-1-84882-982-4

  • Online ISBN: 978-1-84882-983-1

  • eBook Packages: Computer Science, Computer Science (R0)
