
Feature Selection Optimization Using a Hybrid Genetic Algorithm

  • Conference paper
ICT Analysis and Applications

Part of the book series: Lecture Notes in Networks and Systems ((LNNS,volume 154))

Abstract

The curse of dimensionality plays a vital role in data mining and pattern recognition applications. Two approaches address it: feature reduction and feature selection (FS). FS selects the most relevant subset of features with the least redundancy. The main objective of the proposed method is to eliminate irrelevant and redundant features in high-, medium-, and low-dimensional data while providing higher classification accuracy. The proposed method is implemented with a genetic algorithm, and several datasets are used to train and test the model.
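The abstract describes a genetic algorithm that searches over feature subsets to maximize classification accuracy while penalizing redundancy. The authors' implementation is not shown on this page; the sketch below only illustrates the general GA-for-feature-selection loop. The fitness function here is a toy surrogate (a known set of "informative" features plus a subset-size penalty), and all names, operators, and parameters are illustrative assumptions, not the paper's method.

```python
import random

random.seed(0)

N_FEATURES = 20
INFORMATIVE = set(range(5))  # toy ground truth: features 0-4 are useful

def fitness(mask):
    # Toy surrogate for classification accuracy: reward informative
    # features, penalize subset size (a crude redundancy proxy).
    hits = sum(1 for i, bit in enumerate(mask) if bit and i in INFORMATIVE)
    size = sum(mask)
    return hits - 0.1 * size

def crossover(a, b):
    # Single-point crossover on the bitmask chromosome.
    point = random.randrange(1, N_FEATURES)
    return a[:point] + b[point:]

def mutate(mask, rate=0.05):
    # Flip each bit independently with a small probability.
    return [bit ^ (random.random() < rate) for bit in mask]

def ga_feature_selection(pop_size=30, generations=40):
    # Each individual is a 0/1 mask over the feature indices.
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = ga_feature_selection()
selected = [i for i, bit in enumerate(best) if bit]
print(selected)
```

In a real wrapper-style FS setup the toy `fitness` would be replaced by cross-validated accuracy of a classifier trained on the masked features, which is the expensive step the GA's population search amortizes.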



Author information

Correspondence to E. Padmalatha.


Copyright information

© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Padmalatha, E., Sailekhya, S., Athyaab, S.A., Harsh Raj, J. (2021). Feature Selection Optimization Using a Hybrid Genetic Algorithm. In: Fong, S., Dey, N., Joshi, A. (eds) ICT Analysis and Applications. Lecture Notes in Networks and Systems, vol 154. Springer, Singapore. https://doi.org/10.1007/978-981-15-8354-4_41


  • DOI: https://doi.org/10.1007/978-981-15-8354-4_41


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-8353-7

  • Online ISBN: 978-981-15-8354-4

  • eBook Packages: Engineering (R0)
