Managing Monotonicity in Classification by a Pruned AdaBoost

  • Sergio González
  • Francisco Herrera
  • Salvador García
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9648)

Abstract

In classification problems with ordinal monotonic constraints, the class variable should increase in accordance with a subset of the explanatory variables. Models generated by standard classifiers are not guaranteed to fulfill these monotonicity constraints, so algorithms have been designed specifically for such problems. In the particular case of decision trees, the growing and pruning mechanisms have been modified in order to produce monotonic trees. Recently, ensembles have also been adapted to this problem, providing a good trade-off between accuracy and degree of monotonicity. In this paper we study the behaviour of these decision tree mechanisms when built on an AdaBoost scheme, combining them with a simple ensemble pruning method based on the degree of monotonicity. An exhaustive experimental analysis shows that the pruned AdaBoost achieves better predictive performance than standard algorithms while also satisfying the monotonicity restriction.
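To make the pruning criterion concrete, the sketch below scores each base tree of a boosted ensemble by a non-monotonicity index, i.e. the fraction of comparable instance pairs (x ≤ x′ componentwise) whose predicted labels violate f(x) ≤ f(x′), and keeps the most monotonic members. This is a minimal Python sketch under stated assumptions, not the paper's implementation: the names non_monotonicity_index, prune_by_monotonicity and the keep_ratio parameter are illustrative, and the base learners are assumed to expose a scikit-learn-style predict method.

```python
import numpy as np

def non_monotonicity_index(X, y_pred):
    """Fraction of comparable pairs (x_i <= x_j componentwise) whose
    predicted labels break the monotonicity constraint f(x_i) <= f(x_j)."""
    X, y_pred = np.asarray(X), np.asarray(y_pred)
    comparable = violations = 0
    n = len(X)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(X[i] <= X[j]):  # x_i is dominated by x_j
                comparable += 1
                if y_pred[i] > y_pred[j]:        # predicted label decreases: violation
                    violations += 1
    return violations / comparable if comparable else 0.0

def prune_by_monotonicity(members, X, keep_ratio=0.5):
    """Rank base classifiers by ascending non-monotonicity of their
    predictions on X and keep the most monotonic fraction of them."""
    ranked = sorted(members, key=lambda m: non_monotonicity_index(X, m.predict(X)))
    return ranked[:max(1, int(keep_ratio * len(ranked)))]
```

In such a scheme the surviving trees would then vote with their boosting weights as usual; keep_ratio is only an illustrative knob, since the abstract does not fix a particular selection rule.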

Keywords

Monotonic classification · Decision tree induction · AdaBoost · Ensemble pruning

Acknowledgments

This work was partially supported by the Spanish Ministry of Science and Technology under project TIN2014-57251-P and the Andalusian Research Plans P11-TIC-7765, P10-TIC-6858.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Sergio González
  • Francisco Herrera
  • Salvador García

Department of Computer Science and Artificial Intelligence, University of Granada, Granada, Spain
