
Ensemble Learning via Multimodal Multiobjective Differential Evolution and Feature Selection

Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1159)

Abstract

Ensemble learning is an important element of machine learning. However, two essential tasks must be accomplished: training the base classifiers, and constructing an ensemble that suitably balances the diversity and accuracy of those base classifiers. In this paper, a novel ensemble method is proposed that uses a multimodal multiobjective differential evolution (MMODE) algorithm to select feature subsets and optimize the parameters of the base classifiers. Moreover, three methods, namely minimum error ensemble, all Pareto sets ensemble, and error reduction ensemble, are employed to construct ensemble classifiers for classification tasks. Experimental results on several benchmark classification databases show that the proposed algorithm is effective.
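
To make the workflow concrete, below is a minimal sketch (not the authors' implementation) of how a candidate solution can jointly encode a feature mask and a classifier parameter, be scored on two objectives, and how classifiers built from all non-dominated solutions can be combined by majority vote in the spirit of the all Pareto sets ensemble. The random sampling stands in for the MMODE search, and the helper names (`decode`, `evaluate`), the two objectives (cross-validation error and fraction of selected features), and the k-NN base classifier are illustrative assumptions rather than the paper's actual design.

```python
# Hedged sketch: joint feature selection + parameter tuning, with a
# majority-vote ensemble over non-dominated solutions. Random sampling
# replaces the MMODE search; all encodings/objectives are assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def decode(v):
    """Split a real-valued vector into a feature mask and a classifier parameter."""
    mask = v[:-1] > 0.5                 # first D genes: include feature if > 0.5
    k = int(1 + round(v[-1] * 14))      # last gene maps to k in [1, 15] for k-NN
    return mask, k

def evaluate(v):
    """Two objectives to minimise: CV error and fraction of selected features."""
    mask, k = decode(v)
    if not mask.any():
        return 1.0, 1.0                 # penalise empty feature subsets
    clf = KNeighborsClassifier(n_neighbors=k)
    err = 1.0 - cross_val_score(clf, X_tr[:, mask], y_tr, cv=3).mean()
    return err, mask.mean()

# Stand-in for the MMODE search: sample random candidates and keep the
# non-dominated ones (a real run would evolve them with DE operators).
pop = rng.random((60, X.shape[1] + 1))
objs = np.array([evaluate(v) for v in pop])
nondom = [i for i in range(len(pop))
          if not any((objs[j] <= objs[i]).all() and (objs[j] < objs[i]).any()
                     for j in range(len(pop)))]

# "All Pareto sets ensemble" flavour: majority vote over classifiers
# built from every non-dominated solution.
votes = []
for i in nondom:
    mask, k = decode(pop[i])
    clf = KNeighborsClassifier(n_neighbors=k).fit(X_tr[:, mask], y_tr)
    votes.append(clf.predict(X_te[:, mask]))
ensemble_pred = (np.mean(votes, axis=0) > 0.5).astype(int)
print("ensemble accuracy:", (ensemble_pred == y_te).mean())
```

The minimum error ensemble and error reduction ensemble variants would differ only in how members are chosen from the non-dominated set (e.g., picking the lowest-error solution, or greedily adding members while the ensemble error keeps dropping); those selection rules are not reproduced here.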

Keywords

Multimodal multiobjective optimization · Feature selection · Ensemble learning · Classifier parameter

Notes

Acknowledgments

This work is supported by the National Natural Science Foundation of China (61976237, 61922072, 61876169, 61673404).


Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

1. School of Electrical Engineering, Zhengzhou University, Zhengzhou, China
