Feature Subset Selection Using Differential Evolution

Conference paper
Advances in Neuro-Information Processing (ICONIP 2008)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5506)

Abstract

One of the fundamental motivations for feature selection is to overcome the curse of dimensionality. This chapter develops a novel feature selection algorithm based on a combination of the Differential Evolution (DE) optimization technique and statistical feature distribution measures. The new algorithm, referred to as DEFS, applies the DE floating-point optimizer to a combinatorial optimization problem, namely feature selection. The proposed DEFS substantially reduces computational cost while delivering strong performance. DEFS is tested as a search procedure on several datasets of varying dimensionality. Experimental results demonstrate the advantages of the proposed DEFS in terms of solution optimality and memory requirements.
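
As a rough illustration of the general idea only (not the authors' exact DEFS procedure; in particular, the statistical feature-distribution measures mentioned in the abstract are omitted), the sketch below runs a standard DE/rand/1/bin optimizer over continuous vectors and maps each vector to a feature subset by ranking its components. The k-NN wrapper fitness, scikit-learn, and all parameter values are assumptions made for the example.

```python
# Minimal sketch of DE-driven feature subset selection. This is NOT the paper's
# exact DEFS formulation; it only illustrates mapping a continuous DE population
# onto a combinatorial subset-selection problem with a wrapper fitness.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def subset_from_vector(v, n_select):
    # Map a continuous DE vector to a feature subset: take the indices of the
    # n_select largest components (one simple continuous-to-combinatorial mapping).
    return np.argsort(v)[-n_select:]

def fitness(v, X, y, n_select):
    # Wrapper fitness: cross-validated accuracy of a k-NN classifier on the subset.
    idx = subset_from_vector(v, n_select)
    return cross_val_score(KNeighborsClassifier(), X[:, idx], y, cv=3).mean()

def de_feature_search(X, y, n_select, pop_size=30, n_gen=50, F=0.5, CR=0.9, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pop = rng.random((pop_size, d))                  # continuous population in [0, 1]
    fit = np.array([fitness(p, X, y, n_select) for p in pop])
    for _ in range(n_gen):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = a + F * (b - c)                 # DE/rand/1 mutation
            cross = rng.random(d) < CR               # binomial crossover mask
            cross[rng.integers(d)] = True            # ensure at least one gene crosses
            trial = np.where(cross, mutant, pop[i])
            f_trial = fitness(trial, X, y, n_select)
            if f_trial >= fit[i]:                    # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
    best = pop[np.argmax(fit)]
    return np.sort(subset_from_vector(best, n_select)), fit.max()
```

Because each candidate is scored through a classifier, the mapping and fitness above could be swapped for any subset-encoding and evaluation criterion; the DE mutation/crossover/selection loop itself is unchanged.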

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Khushaba, R.N., Al-Ani, A., Al-Jumaily, A. (2009). Feature Subset Selection Using Differential Evolution. In: Köppen, M., Kasabov, N., Coghill, G. (eds) Advances in Neuro-Information Processing. ICONIP 2008. Lecture Notes in Computer Science, vol 5506. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02490-0_13

  • DOI: https://doi.org/10.1007/978-3-642-02490-0_13

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-02489-4

  • Online ISBN: 978-3-642-02490-0

  • eBook Packages: Computer Science, Computer Science (R0)
