
CRC: Consolidated Rules Construction for Expressive Ensemble Classification

  • Conference paper
  • First Online:
Artificial Intelligence XXXIX (SGAI-AI 2022)

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 13652))



Predictive modelling is one of the most important data mining tasks: models are trained on data with ground-truth labels and then applied to previously unseen data to predict the value of a target variable. Ensemble models are often used for predictive modelling, since they tend to be more accurate than standalone classification models. However, ensembles are opaque, and their predictions are difficult for human analysts to interpret. Explainability of classification models is needed in many critical applications, such as stock market analysis, credit risk evaluation and intrusion detection. A recent development by the authors of this paper is ReG-Rules, an ensemble learner that forms, for each prediction, a committee comprising the first rule that fires from each base classifier. Because the rules are human-interpretable, ReG-Rules is a step towards explainable ensemble classification. Nevertheless, a set of matching rules is presented to the human analyst for each prediction, so numerous rules must still be considered when explaining the model. This paper introduces an extension of ReG-Rules termed Consolidated Rules Construction (CRC). CRC merges the individual base classification models into a single rule set, which is then applied for each prediction, so that only one rule is presented to the human analyst per prediction. CRC is therefore more explainable than ReG-Rules, and empirical evaluation shows that it remains competitive with ReG-Rules on various performance measures.
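The difference between committee-based prediction and consolidated prediction can be illustrated with a minimal sketch. The `Rule` class, the equality-only conditions, and the majority vote below are simplifying assumptions for illustration; the actual ReG-Rules and CRC algorithms use richer rule terms and a more sophisticated merging and ranking procedure than a plain list concatenation.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    conditions: dict  # attribute -> required value (hypothetical equality-only terms)
    label: str        # predicted class

    def fires(self, x):
        # A rule fires when every condition matches the instance.
        return all(x.get(a) == v for a, v in self.conditions.items())

def committee_predict(base_rule_sets, x):
    """ReG-Rules-style: collect the first firing rule from EACH base classifier,
    then vote. The analyst is shown the whole committee of rules."""
    committee = []
    for rules in base_rule_sets:
        for r in rules:
            if r.fires(x):
                committee.append(r)
                break
    labels = [r.label for r in committee]
    return max(set(labels), key=labels.count), committee

def crc_predict(consolidated_rules, x):
    """CRC-style: one merged rule set; the single first-firing rule both
    predicts and explains, so the analyst sees exactly one rule."""
    for r in consolidated_rules:
        if r.fires(x):
            return r.label, r
    return None, None
```

With two toy base rule sets, `committee_predict` returns a prediction plus a committee of several rules to inspect, whereas `crc_predict` over the merged set returns the same prediction justified by a single rule, which is the explainability gain CRC targets.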





The research in this paper is partly supported by the Ministry for Science and Culture, Lower Saxony, Germany, through funds from the Niedersächsische Vorab (ZN3480).

Author information



Corresponding author

Correspondence to Frederic Stahl.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Almutairi, M., Stahl, F., Bramer, M. (2022). CRC: Consolidated Rules Construction for Expressive Ensemble Classification. In: Bramer, M., Stahl, F. (eds) Artificial Intelligence XXXIX. SGAI-AI 2022. Lecture Notes in Computer Science, vol. 13652. Springer, Cham.


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-21440-0

  • Online ISBN: 978-3-031-21441-7

