A Bayesian Approach for Combining Ensembles of GP Classifiers
Recently, ensemble techniques have also attracted the attention of Genetic Programming (GP) researchers, with the goal of further improving GP classification performance. Among these techniques, bagging and boosting have been considered. They improve classification accuracy by combining the responses of different classifiers through a majority vote rule. However, it is hard to ensure that the classifiers in the ensemble are sufficiently diverse to avoid correlated errors. Our approach addresses this problem by designing a framework for effectively combining GP-based ensembles by means of a Bayesian network. The proposed system consists of two modules. The first applies a boosting technique to a GP-based classification algorithm in order to generate an effective ensemble of decision trees. The second uses a Bayesian network to combine the responses provided by this ensemble and to select the most appropriate decision trees. The Bayesian network is learned by means of a specifically devised evolutionary algorithm. Preliminary experimental results confirm the effectiveness of the proposed approach.
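The majority vote rule mentioned above, which the proposed Bayesian combiner improves upon, can be sketched as follows. This is a minimal illustration of the baseline combining rule, not the paper's method; the function and variable names are illustrative assumptions.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine the class labels predicted by an ensemble via plain
    majority vote: the most frequent label wins (ties are broken by
    Counter's insertion order, i.e. arbitrarily)."""
    return Counter(predictions).most_common(1)[0][0]

# Each inner list holds one classifier's predicted labels for three samples.
ensemble_outputs = [
    ["A", "B", "B"],   # classifier 1
    ["A", "B", "A"],   # classifier 2
    ["B", "B", "A"],   # classifier 3
]

# Vote per sample across all classifiers in the ensemble.
combined = [majority_vote(sample) for sample in zip(*ensemble_outputs)]
# combined == ["A", "B", "A"]
```

Note that if the classifiers make correlated errors (e.g. classifiers 1 and 3 above both mislabel the same sample), the vote inherits the error, which is exactly the diversity problem the Bayesian combination is designed to mitigate.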