Applied Intelligence

Volume 49, Issue 3, pp 1172–1184

A new method for feature selection based on intelligent water drops

  • Mohammad Hossein Khosravi
  • Parsa Bagherzadeh


One of the trending research areas in data mining and machine learning is feature selection. Feature selection is used as a technique for improving the classification accuracy of a classifier, as well as a convenient way to visualize data. In this paper, a new method for feature subset selection, based on the intelligent water drops algorithm, is proposed. The intelligent water drops algorithm is a metaheuristic inspired by the movement of water drops in nature. In the proposed method, a new objective function suited to the intelligent water drops algorithm is introduced. The objective function is designed so that the selected feature vector achieves good classification accuracy while also providing a good degree of generalization. According to the experiments, the proposed approach leads to more accurate results as well as a significant reduction in the number of features.


Keywords: Intelligent water drops · Multi-objective optimization · Supervised feature selection · Class scatter matrices
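The kind of wrapper search the abstract describes can be illustrated with a small, hedged sketch. This is not the authors' algorithm: the fitness function (leave-one-out 1-NN accuracy minus a feature-count penalty), the soil-update rule, the synthetic data, and all parameter values below are illustrative assumptions written in the spirit of intelligent water drops, not the objective function or method evaluated in the paper.

```python
# Illustrative sketch only: an IWD-flavoured wrapper for feature subset
# selection. Fitness, soil update, and parameters are assumptions, not
# the paper's method.
import random

def make_data(n=80, seed=0):
    """Synthetic two-class data: features 0-1 are informative, 2-5 are noise."""
    rng = random.Random(seed)
    X, y = [], []
    for i in range(n):
        label = i % 2
        point = [rng.gauss(3.0 * label, 0.5), rng.gauss(-3.0 * label, 0.5)]
        point += [rng.gauss(0.0, 1.0) for _ in range(4)]  # irrelevant features
        X.append(point)
        y.append(label)
    return X, y

def loo_1nn_accuracy(X, y, subset):
    """Leave-one-out 1-NN accuracy using only the selected features."""
    if not subset:
        return 0.0
    correct = 0
    for i in range(len(X)):
        best_d, best_j = float("inf"), -1
        for j in range(len(X)):
            if j == i:
                continue
            d = sum((X[i][f] - X[j][f]) ** 2 for f in subset)
            if d < best_d:
                best_d, best_j = d, j
        correct += int(y[best_j] == y[i])
    return correct / len(X)

def iwd_select(X, y, n_drops=8, n_iter=10, alpha=0.02, seed=1):
    """IWD-flavoured subset search: low 'soil' makes a feature attractive."""
    rng = random.Random(seed)
    n_feat = len(X[0])
    soil = [1.0] * n_feat
    best_subset, best_fit = [], -1.0
    for _ in range(n_iter):
        for _ in range(n_drops):
            # A drop includes each feature with probability inverse to its soil.
            subset = [f for f in range(n_feat)
                      if rng.random() < 1.0 / (1.0 + soil[f])]
            # Multi-objective flavour: accuracy traded off against subset size.
            fit = loo_1nn_accuracy(X, y, subset) - alpha * len(subset)
            if fit > best_fit:
                best_fit, best_subset = fit, subset
            # Soil update: a fit subset erodes soil on its features, making
            # them more likely to be picked by later drops.
            for f in subset:
                soil[f] = max(0.1, soil[f] - 0.2 * fit)
    return best_subset, best_fit
```

On this synthetic data the search tends to retain at least one of the two informative features while the `alpha` penalty pushes it toward small subsets, which is the mechanism behind the abstract's claim of improved accuracy together with a reduced number of features.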



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Faculty of Electrical and Computer Engineering, University of Birjand, Birjand, Iran
  2. Department of Computer Science and Software Engineering, Concordia University, Montreal, Canada
