
Dynamic interaction-based feature selection algorithm for maximal relevance minimal redundancy

Published in: Applied Intelligence

Abstract

In feature selection, distinguishing redundancy from dependency relationships between features is a challenging task. In recent years, scholars have proposed a number of solutions, but most fail to effectively separate dependent features from redundant ones. In addition, the influence of the feature-relevant complementary term on a candidate feature is often ignored, which further weakens this discriminative ability. To improve it, the concept of feature interaction degree is proposed, on which new discriminant criteria for feature redundancy and dependency are defined. Combining these criteria with the feature-relevant complementary term, a dynamic interaction weight is constructed. A filter feature selection algorithm, DIMRMR, is then proposed to effectively resolve the confusion between redundancy and dependency. Experimental results show that the proposed algorithm achieves the best classification performance on most of the datasets.
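The abstract builds on the classic max-relevance min-redundancy (mRMR) criterion, which scores each candidate feature by its mutual information with the class minus its average mutual information with already-selected features, and on interaction information, which underlies the "feature interaction degree" idea (a positive three-way interaction signals complementarity rather than redundancy). The paper's DIMRMR weighting itself is not reproduced here; the following is a minimal sketch of those two underlying ingredients for discrete data, with all function names illustrative:

```python
import numpy as np

def mutual_info(x, y):
    """Empirical mutual information I(x; y) in nats for discrete 1-D arrays."""
    mi = 0.0
    for a in np.unique(x):
        for b in np.unique(y):
            pxy = np.mean((x == a) & (y == b))   # joint probability
            px, py = np.mean(x == a), np.mean(y == b)
            if pxy > 0:
                mi += pxy * np.log(pxy / (px * py))
    return mi

def interaction_info(x, y, c):
    """Three-way interaction I(x; y; c) = I(x, y; c) - I(x; c) - I(y; c).
    Positive: x and y are complementary w.r.t. c; negative: redundant."""
    xy = x * (int(y.max()) + 1) + y  # encode the joint variable (x, y)
    return mutual_info(xy, c) - mutual_info(x, c) - mutual_info(y, c)

def mrmr(X, y, k):
    """Greedy max-relevance min-redundancy selection over columns of X."""
    n_feats = X.shape[1]
    relevance = [mutual_info(X[:, j], y) for j in range(n_feats)]
    selected = [int(np.argmax(relevance))]       # start with the most relevant
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_feats):
            if j in selected:
                continue
            redundancy = np.mean([mutual_info(X[:, j], X[:, s])
                                  for s in selected])
            score = relevance[j] - redundancy    # relevance penalized by redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```

On an XOR-style pair (c = x XOR y), `interaction_info` is positive even though each feature alone carries no information about the class, which is exactly the dependency case that plain redundancy terms misjudge and that interaction-aware weights are meant to capture.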



Acknowledgments

This work was funded by the National Natural Science Foundation of China (61572229, 61872158) and the Project of Jilin Provincial Education Department (JJKH20181060SK).

Author information


Correspondence to Aifeng Xie or Jianqi Zhu.



Cite this article

Yin, K., Xie, A., Zhai, J. et al. Dynamic interaction-based feature selection algorithm for maximal relevance minimal redundancy. Appl Intell 53, 8910–8926 (2023). https://doi.org/10.1007/s10489-022-03922-5
