Feature Selection

Part of the book series: Unsupervised and Semi-Supervised Learning (UNSESUL)

Abstract


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Ros, F., Riad, R. (2024). Feature selection. In: Feature and Dimensionality Reduction for Clustering with Deep Learning. Unsupervised and Semi-Supervised Learning. Springer, Cham. https://doi.org/10.1007/978-3-031-48743-9_3

  • DOI: https://doi.org/10.1007/978-3-031-48743-9_3

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-48742-2

  • Online ISBN: 978-3-031-48743-9

  • eBook Packages: Engineering, Engineering (R0)
