
Multimodal Perturbation and Cluster Pruning Based Selective Ensemble Classifier and Its Iron Industrial Application

  • Regular Papers
  • Intelligent Control and Applications

International Journal of Control, Automation and Systems

Abstract

Selective ensemble learning aims to search, within the original set of base classifiers, for an optimal subset that balances accuracy and diversity, and to combine it into an ensemble classifier with strong generalization performance. This paper proposes a selective ensemble classifier named BRFS-APCSC, which handles the generation and the selection of a set of accurate and diverse base classifiers in two steps. In the first step, a multimodal perturbation method is introduced to train distinct base classifiers: it perturbs the sample space by Bootstrap and the feature space by a newly proposed semi-random feature selection, which combines core attribute theory with an improved maximum relevance minimum redundancy (mRMR) algorithm. In the second step, to search for the optimal classifier subset, affinity propagation clustering groups the base classifiers produced in the first step; the base classifiers are then treated as features, so that the improved mRMR algorithm can select part of the base classifiers from each cluster for integration. UCI datasets and an actual semi-decarbonization dataset are employed to verify the performance of BRFS-APCSC. The experimental results demonstrate that BRFS-APCSC differs significantly from other selective ensemble methods and improves classification accuracy.
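The two-step pipeline described above can be illustrated with a minimal sketch. All names here (`bootstrap_feature_views`, `cluster_prune_vote`, the agreement threshold) are hypothetical: the paper's semi-random feature selection with core attributes and improved mRMR is replaced by a uniform random column subset, and affinity propagation plus within-cluster mRMR selection are replaced by a naive agreement-threshold grouping with a per-cluster accuracy pick, purely to keep the toy self-contained.

```python
import random
from collections import Counter

def bootstrap_feature_views(X, y, n_classifiers, n_features, seed=0):
    """Step 1 sketch: perturb the sample space with Bootstrap and the
    feature space with a random column subset. (The paper additionally
    constrains the subset via core attributes and an improved mRMR
    score, which this toy omits.)"""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    views = []
    for _ in range(n_classifiers):
        rows = [rng.randrange(n) for _ in range(n)]      # sample n rows with replacement
        cols = sorted(rng.sample(range(d), n_features))  # feature-space perturbation
        views.append(([[X[i][j] for j in cols] for i in rows],
                      [y[i] for i in rows], cols))
    return views

def cluster_prune_vote(preds, y_val, sim_threshold=0.7):
    """Step 2 sketch: group similar base classifiers, keep one per
    group, and fuse by majority vote. (The paper uses affinity
    propagation for the grouping and an improved mRMR criterion for
    the within-cluster pick; a naive agreement threshold and an
    accuracy pick stand in for both here.)"""
    n = len(y_val)

    def agree(a, b):  # fraction of validation points where two classifiers agree
        return sum(pa == pb for pa, pb in zip(a, b)) / n

    def acc(p):  # validation accuracy of one classifier's predictions
        return sum(pi == yi for pi, yi in zip(p, y_val)) / n

    clusters = []  # greedy grouping: join the first cluster whose leader agrees enough
    for i, p in enumerate(preds):
        for c in clusters:
            if agree(preds[c[0]], p) >= sim_threshold:
                c.append(i)
                break
        else:
            clusters.append([i])

    selected = [max(c, key=lambda i: acc(preds[i])) for c in clusters]
    fused = [Counter(preds[i][t] for i in selected).most_common(1)[0][0]
             for t in range(n)]  # majority vote over the pruned ensemble
    return selected, fused

# toy run: validation labels and three base classifiers' predictions
y_val = [0, 1, 0, 1, 0, 1]
preds = [[0, 1, 0, 1, 0, 1],   # accurate
         [0, 1, 0, 1, 0, 0],   # redundant with the first -> same cluster
         [1, 0, 1, 0, 1, 0]]   # diverse -> its own cluster
selected, fused = cluster_prune_vote(preds, y_val)
print(selected)  # one representative per cluster
```

The pruning step illustrates the motivation behind cluster-based selection: the redundant second classifier is absorbed into the first one's cluster and discarded, while the diverse third classifier survives as its own cluster's representative.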



Author information


Corresponding author

Correspondence to Xuefeng Yan.

Ethics declarations

All authors certify that they have no affiliations with or involvement in any organization or entity with any financial interest or non-financial interest in the subject matter or materials discussed in this manuscript.

Additional information

Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This work was supported by the National Natural Science Foundation of China (Grant 21878081) and the National Key Research and Development Program of China (Grant 2021YFC2101100).

Qiannan Wu received her B.S. degree in automation from Jiangnan University, Wuxi, China. Her research interests include feature space analysis, selective ensemble learning, and their practical applications.

Yifei Sun received his B.E. degree in control science and engineering from East China University of Science and Technology, Shanghai, China, in 2022. He is currently working toward a Ph.D. degree in control science and engineering. His current research interests include industrial data analytics, industrial process soft measurements, and time series prediction.

Lihua Lv was born in Jilin, China, in 1967. He received his Ph.D. degree in automatic control from Zhejiang University, China, in 2001. Since May 2001, he has worked at the Intelligent Manufacturing Research Institute of the Central Research Institute of Baoshan Iron & Steel Co., Ltd., where he currently holds a Chief Researcher position. His current research interests include industrial data analysis, intelligent control, and energy-saving control.

Xuefeng Yan received his B.S. degree in biochemical engineering and his Ph.D. degree in control theory and engineering from Zhejiang University, Hangzhou, China, in 1995 and 2002, respectively. He is currently a Professor with East China University of Science and Technology, Shanghai, China. His current research interests include complex chemical process modeling, optimization, and control, as well as process monitoring, fault diagnosis, and intelligent information processing.


Cite this article

Wu, Q., Sun, Y., Lv, L. et al. Multimodal Perturbation and Cluster Pruning Based Selective Ensemble Classifier and Its Iron Industrial Application. Int. J. Control Autom. Syst. 21, 3813–3824 (2023). https://doi.org/10.1007/s12555-022-0697-0
