Genetically Evolved Trees Representing Ensembles

  • Ulf Johansson
  • Tuve Löfström
  • Rikard König
  • Lars Niklasson
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4029)


We have recently proposed a novel algorithm for ensemble creation called GEMS (Genetic Ensemble Member Selection). GEMS first trains a fixed number of neural networks (here twenty) and then uses genetic programming to combine these networks into an ensemble. The use of genetic programming makes it possible for GEMS not only to consider ensembles of different sizes, but also to use ensembles as intermediate building blocks. In this paper, which is the first extensive study of GEMS, the representation language is extended to include tests partitioning the data, further increasing flexibility. In addition, several micro techniques are applied to reduce overfitting, which appears to be the main problem for this powerful algorithm. The experiments show that GEMS, when evaluated on 15 publicly available data sets, obtains very high accuracy, clearly outperforming both straightforward ensemble designs and standard decision tree algorithms.
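To make the extended representation language concrete, the following Python sketch (illustrative only, not the authors' implementation; all class names and the stand-in members are assumptions) encodes an ensemble as a tree: leaves are trained ensemble members, Average nodes combine the result vectors of their subtrees, and Test nodes are the partitioning tests introduced in this paper, routing each instance to one subtree. GEMS itself trains twenty neural networks and evolves such trees with genetic programming; the random search below merely stands in for the GP loop.

```python
# Minimal sketch of a GEMS-style ensemble tree (illustrative, not the paper's code).
import random

class Leaf:
    """A single trained ensemble member; any callable returning a
    class-probability vector stands in for a trained neural network."""
    def __init__(self, model):
        self.model = model
    def predict(self, x):
        return self.model(x)

class Average:
    """Inner node: average the result vectors of the two subtrees."""
    def __init__(self, left, right):
        self.left, self.right = left, right
    def predict(self, x):
        a, b = self.left.predict(x), self.right.predict(x)
        return [(u + v) / 2.0 for u, v in zip(a, b)]

class Test:
    """Partitioning test: route the instance to a subtree depending on an
    attribute threshold (attributes assumed scaled to [0, 1])."""
    def __init__(self, attr, threshold, left, right):
        self.attr, self.threshold = attr, threshold
        self.left, self.right = left, right
    def predict(self, x):
        branch = self.left if x[self.attr] <= self.threshold else self.right
        return branch.predict(x)

def random_tree(members, n_attrs, depth):
    """Grow a random ensemble tree over a fixed pool of trained members."""
    if depth == 0 or random.random() < 0.3:
        return Leaf(random.choice(members))
    left = random_tree(members, n_attrs, depth - 1)
    right = random_tree(members, n_attrs, depth - 1)
    if random.random() < 0.5:
        return Average(left, right)
    return Test(random.randrange(n_attrs), random.random(), left, right)

def accuracy(tree, data):
    """Fitness: fraction of (x, y) pairs whose arg-max class is correct."""
    hits = sum(max(range(len(p)), key=p.__getitem__) == y
               for x, y in data for p in [tree.predict(x)])
    return hits / len(data)

# Toy usage: three stand-in "networks" and random two-attribute data.
members = [lambda x, b=b: [1 - (x[0] + b) % 1.0, (x[0] + b) % 1.0]
           for b in (0.1, 0.4, 0.7)]
data = [([random.random(), random.random()], random.randrange(2))
        for _ in range(50)]
best = max((random_tree(members, n_attrs=2, depth=3) for _ in range(200)),
           key=lambda t: accuracy(t, data))
print("best tree's training accuracy:", accuracy(best, data))
```

The fitness used here is raw training accuracy; as the abstract notes, overfitting is the main problem for so flexible a representation, which is why the paper applies several techniques to reduce it. Typical countermeasures (assumptions here, not details confirmed from the paper) include measuring fitness on held-out data or penalizing tree size.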


Keywords: Hidden Layer · Genetic Programming · Representation Language · Result Vector · Neural Network Ensemble





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ulf Johansson¹
  • Tuve Löfström¹
  • Rikard König¹
  • Lars Niklasson²

  1. School of Business and Informatics, University of Borås, Sweden
  2. School of Humanities and Informatics, University of Skövde, Sweden
