Abstract
PRISM, RISE, C4.5 and CN2 are popular algorithms for solving binary and multi-class classification problems. They are simple yet powerful when training data are limited, whereas several state-of-the-art classifiers such as Random Forest and XGBoost fail to perform well without extensive training. With limited training, exceptions play a vital role, and an exception-handling strategy can boost the performance of these algorithms. The underlying strategies, and most importantly the output formats, of these algorithms differ completely from one another. PRISM produces modular rules but offers no scope for handling exceptions. RISE provides the most intelligible form of rules, but again exceptions are not taken care of. C4.5, on the other hand, yields decision trees, which are neither very comprehensible nor easy for machines or humans to manipulate; moreover, a tree cannot be exception tolerant. CN2 induces modular rules by finding the best complexes, each covering the maximum number of instances and collectively covering them all. This paper proposes an ensemble that uses these algorithms individually as base classifiers and improves performance through several techniques: transposing the outputs into a common format resembling RISE-induced rules, bagging, appending exceptions to the rule set along with a default rule, eliminating inefficient rules, and employing a new combination method called “clustering of rules according to their specificity”. The ensemble is named ETEL (Exception Tolerant Ensemble Learner), and an empirical study shows that ETEL significantly outperforms state-of-the-art ensembles, namely AdaBoost and Random Forest.
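To make the abstract's core idea concrete, the following is a minimal illustrative sketch of exception-tolerant, ordered rule classification: exception rules are consulted before the general rules they override, general rules are tried from most to least specific, and a default rule catches anything left. This is a hypothetical toy (the rule sets and attribute names are invented for illustration), not the authors' ETEL implementation.

```python
def matches(conditions, instance):
    """A rule fires when every attribute test in its antecedent holds."""
    return all(instance.get(attr) == value for attr, value in conditions.items())

def classify(instance, exception_rules, rules, default_label):
    # Exceptions first: they encode rare cases that contradict a general rule.
    for conditions, label in exception_rules:
        if matches(conditions, instance):
            return label
    # General rules next, ordered from most to least specific
    # (here, specificity = number of attribute tests in the antecedent).
    for conditions, label in sorted(rules, key=lambda r: len(r[0]), reverse=True):
        if matches(conditions, instance):
            return label
    # Default rule, e.g. the majority class of the training data.
    return default_label

# Invented example rule sets over a toy "weather" domain.
exceptions = [({"outlook": "sunny", "humidity": "high"}, "no")]
general = [({"outlook": "sunny"}, "yes"),
           ({"outlook": "rainy", "wind": "strong"}, "no")]

print(classify({"outlook": "sunny", "humidity": "high"}, exceptions, general, "yes"))    # exception overrides the general rule -> "no"
print(classify({"outlook": "sunny", "humidity": "normal"}, exceptions, general, "yes"))  # general rule fires -> "yes"
print(classify({"outlook": "overcast"}, exceptions, general, "yes"))                     # nothing fires -> default "yes"
```

The key design point sketched here is that an exception does not require retraining or restructuring the rule set: it is simply prepended, which is what makes a flat rule list (unlike a decision tree) easy to make exception tolerant.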
Data Availability
There is no additional data associated with this manuscript.
References
Asuncion A, Newman D (2007) UCI machine learning repository
Breiman L (1996) Bagging predictors. Mach Learn 24(2):123–140. https://doi.org/10.1007/BF00058655
Grzymala-Busse JW (1993) Selected algorithms of machine learning from examples. Fundamenta Informaticae 18(2):193–207
Cendrowska J (1987) Prism: an algorithm for inducing modular rules. Int J Man Mach Stud 27(4):349–370. https://doi.org/10.1016/S0020-7373(87)80003-2
Chen T, Guestrin C (2016) XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD international conference on knowledge discovery and data mining, pp 785–794. https://doi.org/10.1145/2939672.2939785
Cho S, Kim JH (1995) Multiple network fusion using fuzzy logic. IEEE Trans Neural Networks 6(2):497–501. https://doi.org/10.1109/72.363487
Clark P, Niblett T (1989) The CN2 induction algorithm. Mach Learn 3(4):261–283. https://doi.org/10.1023/A:1022641700528
Dasarathy BV, Sheela BV (1979) A composite classifier system design: concepts and methodology. Proc IEEE 67(5):708–713. https://doi.org/10.1109/PROC.1979.11321
Demsar J (2006) Statistical comparisons of classifiers over multiple data sets. J Mach Learn Res 7:1–30
Derbeko P, El-Yaniv R, Meir R (2002) Variance optimized bagging. In: European conference on machine learning. Springer, Berlin, Heidelberg, pp 60–72. https://doi.org/10.1007/3-540-36755-1_6
Domingos P (1994) The RISE system: conquering without separating. In: Proceedings of the sixth international conference on tools with artificial intelligence, pp 704–707. https://doi.org/10.1109/TAI.1994.346421
Domingos P (1996a) Using partitioning to speed up specific-to-general rule induction. In: Proceedings of the AAAI-96 workshop on integrating multiple learned models, pp 29–34
Domingos P (1996b) Unifying instance-based and rule-based induction. Mach Learn 24(2):141–168. https://doi.org/10.1007/BF00058656
Dong X, Yu Z, Cao W, Shi Y, Ma Q (2020) A survey on ensemble learning. Front Comput Sci 14(2):241–258
Freund Y, Schapire RE (1996) Experiments with a new boosting algorithm. In: Proceedings of the 13th international conference on machine learning (ICML), pp 148–156
Freund Y, Iyer R, Schapire R, Singer Y (2003) An efficient boosting algorithm for combining preferences. J Mach Learn Res 4:933–969
Hansen LK, Salamon P (1990) Neural network ensembles. IEEE Trans Pattern Anal Mach Intell 12(10):993–1001. https://doi.org/10.1109/34.58871
Kuncheva L (2014) Combining pattern classifiers: methods and algorithms. John Wiley & Sons
McCarthy J (1959) Discussion of Oliver Selfridge, “Pandemonium: a paradigm for learning”. In: Symposium on the mechanization of thought processes. HM Stationery Office, London
Michie D (1986) Technology lecture: the superarticulacy phenomenon in the context of software manufacture. Proc R Soc Lond A Math Phys Sci 405(1829):185–212. https://doi.org/10.1098/rspa.1986.0049
Mohammad A, Rezaeenour J, Hadavandi E (2014) Effective intrusion detection with a neural network ensemble using fuzzy clustering and stacking combination method. J Comput Secur 1(4):293–305
Pina A, Zaverucha G (2004) SUNRISE: improving the performance of the RISE algorithm. In: Knowledge discovery in databases: PKDD 2004, pp 518–520. https://doi.org/10.1007/978-3-540-30116-5_52
Pohlert T (2014) The pairwise multiple comparison of mean ranks package (PMCMR). R package
Polikar R (2012) Ensemble learning. In: Ensemble machine learning. Springer, Boston, MA, pp 1–34. https://doi.org/10.1007/978-1-4419-9326-7_1
Quinlan JR (1993) C4.5: programs for machine learning. Morgan Kaufmann. https://doi.org/10.1007/BF00993309
Quinlan R (2004) Data mining tools See5 and C5.0. RuleQuest Research
Quinlan JR (2014) C4.5: programs for machine learning. Elsevier
Rohlfing T, Russakoff DB, Maurer CR (2004) Performance-based classifier combination in atlas-based image segmentation using expectation-maximization parameter estimation. IEEE Trans Med Imaging 23(8):983–994. https://doi.org/10.1109/TMI.2004.830803
Ruggieri S (2002) Efficient C4.5 [classification algorithm]. IEEE Trans Knowl Data Eng 14(2):438–444. https://doi.org/10.1109/69.991727
Ruta D, Gabrys B (2005) Classifier selection for majority voting. Inf Fusion 6(1):63–81
Sikder S, Metya SK, Goswami RS (2019a) Exception included, ordered rule induction from the Set of Exemplars (ExIORISE). Int J Innovative Technol Exploring Eng (IJITEE) 9:57–62. https://doi.org/10.35940/ijitee.B1039.1292S19
Sikder S, Metya SK, Goswami RS (2019b) Exception-tolerant decision tree/rule based classifiers. Ingénierie des Systèmes d’Information 24(5):553–558. https://doi.org/10.18280/isi.240514
Wen Y, Tran D, Ba J (2020) BatchEnsemble: an alternative approach to efficient ensemble and lifelong learning. In: Eighth international conference on learning representations. https://doi.org/10.48550/arXiv.2002.06715
Wu Y, Liu L, Xie Z, Chow KH, Wei W (2021) Boosting ensemble accuracy by revisiting ensemble diversity metrics. In: Proceedings of the IEEE/CVF conference on computer vision and pattern recognition
Zheng Z, Padmanabhan B, Zheng H (2004) A DEA approach for model combination. In: Proceedings of the tenth ACM SIGKDD international conference on knowledge discovery and data mining, pp 755–760. https://doi.org/10.1145/1014052.1014152
Zimmerman DW, Zumbo BD (1993) Relative power of the Wilcoxon test, the Friedman test, and repeated-measures ANOVA on ranks. J Exp Educ 62(1):75–86. https://doi.org/10.1080/00220973.1993.9943832
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Sikder, S., Dadure, P. & Metya, S. Fast and efficient exception tolerant ensemble for limited training. Evolving Systems 14, 1025–1034 (2023). https://doi.org/10.1007/s12530-022-09483-9