Hybrid Dynamic Learning Systems for Regression

  • Kaushala Dias
  • Terry Windeatt
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9095)

Abstract

Methods of introducing diversity into ensemble predictors for regression problems are presented. Two methods are proposed: one based on pruning and the other a hybrid approach. In both, diversity is introduced during training, as part of the same learning process: not all ensemble members are trained in the same manner; rather, members are trained selectively, yielding a diverse set of predictors with strengths in different parts of the training set. As a result, the prediction accuracy and generalization ability of the trained ensemble are enhanced. The pruning and hybrid heuristics attempt to combine accurate yet complementary members, enhancing performance by dynamically modifying the pruned aggregation and distributing ensemble member selection over the entire dataset. A comparison with Negative Correlation Learning and with a static ensemble pruning approach for regression highlights the performance improvement yielded by the dynamic methods. Experiments use Multi-Layer Perceptron predictors on benchmark datasets and on a signal calibration application.
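The paper's exact pruning and hybrid algorithms are not reproduced here, but the following minimal Python sketch illustrates the general idea of dynamic ensemble selection for regression that the abstract describes: a pool of MLP predictors is overproduced on bootstrap replicates, and for each query the members that are most accurate in the query's neighbourhood of the training set are kept and averaged. The scikit-learn dependency, the Friedman benchmark, and all parameter choices (pool size, neighbourhood size, number of members kept) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of dynamic ensemble selection for regression -- an
# illustration of the general idea, NOT the authors' algorithm.
# Assumes scikit-learn; dataset and parameters are arbitrary choices.
import numpy as np
from sklearn.datasets import make_friedman1
from sklearn.neighbors import NearestNeighbors
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X, y = make_friedman1(n_samples=400, noise=0.5, random_state=0)
X_train, y_train, X_test, y_test = X[:300], y[:300], X[300:], y[300:]

# 1. Overproduce: train a diverse pool of MLPs on bootstrap replicates.
pool = []
for m in range(15):
    idx = rng.integers(0, len(X_train), len(X_train))
    mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                       random_state=m).fit(X_train[idx], y_train[idx])
    pool.append(mlp)

# Per-member squared error on every training point (a competence map).
errors = np.stack([(m.predict(X_train) - y_train) ** 2 for m in pool])

# 2. Dynamic selection: for each test point, keep the members that are
#    most accurate in its local neighbourhood, then average them.
nn = NearestNeighbors(n_neighbors=10).fit(X_train)
_, neigh = nn.kneighbors(X_test)
preds = np.stack([m.predict(X_test) for m in pool])  # (n_members, n_test)

n_keep = 5
y_hat = np.empty(len(X_test))
for i in range(len(X_test)):
    local_err = errors[:, neigh[i]].mean(axis=1)  # competence per member
    keep = np.argsort(local_err)[:n_keep]         # prune to best members
    y_hat[i] = preds[keep, i].mean()

print("dynamic-selection MSE:", np.mean((y_hat - y_test) ** 2))
print("full-ensemble MSE:    ",
      np.mean((preds.mean(axis=0) - y_test) ** 2))
```

Comparing the dynamic-selection error with the plain full-ensemble average gives a rough feel for why distributing member selection over the dataset, rather than committing to a single static pruning, can improve accuracy.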

Keywords

Ensemble methods · Ensemble pruning · Ensemble learning · Neural networks

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Centre for Vision, Speech and Signal Processing, University of Surrey, Guildford, UK
