
Local Negative Correlation with Resampling

  • Ricardo Ñanculef
  • Carlos Valle
  • Héctor Allende
  • Claudio Moraga
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4224)

Abstract

This paper presents a learning algorithm that combines two well-known methods for generating ensemble diversity: negative correlation of errors and resampling. In this algorithm, a set of learners iteratively and synchronously improve their state using information about the performance of a fixed number of other learners in the ensemble, generating a sort of local negative correlation. Resampling allows the base algorithm to control the impact of highly influential data points, which in turn can improve its generalization error. The resulting algorithm can be viewed as a generalization of bagging in which each learner is no longer independent but can be locally coupled with other learners. We demonstrate the technique on two real data sets using neural network ensembles.
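
To make the abstract's description concrete, the following is a minimal sketch, not the paper's actual algorithm: it assumes linear base learners (the paper uses neural networks), a ring neighborhood of k learners on each side, and illustrative hyperparameters (lam, k, lr, epochs). The update rule follows classical negative correlation learning, with the whole-ensemble mean replaced by a local neighborhood mean; the paper's exact penalty may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_lnc_ensemble(X, y, n_learners=10, k=1, lam=0.5, lr=0.01, epochs=200):
    """Sketch of local negative correlation learning with resampling.

    Each learner i is a linear model f_i(x) = w_i . x trained on its own
    bootstrap resample of (X, y). At every epoch all learners update
    synchronously, and each one is pushed away from the mean output of its
    k nearest neighbours on a ring topology (local negative correlation),
    instead of the whole-ensemble mean of classical NC learning.
    """
    n, d = X.shape
    W = rng.normal(scale=0.1, size=(n_learners, d))
    # Resampling: one fixed bootstrap sample of indices per learner.
    boots = [rng.integers(0, n, size=n) for _ in range(n_learners)]

    for _ in range(epochs):
        preds = X @ W.T                      # (n, n_learners): all outputs
        grads = np.zeros_like(W)
        for i in range(n_learners):
            idx = boots[i]
            # Neighbourhood of learner i on a ring: i-k .. i+k, excluding i.
            nbrs = [(i + s) % n_learners for s in range(-k, k + 1) if s != 0]
            f_i = preds[idx, i]
            f_nbr = preds[idx][:, nbrs].mean(axis=1)
            # Error term pulls toward the target; the penalty term pushes the
            # learner's output away from its neighbours' mean output.
            delta = (f_i - y[idx]) - lam * (f_i - f_nbr)
            grads[i] = delta @ X[idx] / len(idx)
        W -= lr * grads                      # synchronous update of all learners
    return W

# Toy usage on a noisy linear target.
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)
W = train_lnc_ensemble(X, y)
print("ensemble MSE:", np.mean(((X @ W.T).mean(axis=1) - y) ** 2))
```

Setting lam = 0 decouples the learners entirely and the procedure reduces to bagging, which is the sense in which the algorithm generalizes it.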

Keywords

Training Pattern, Generalization Error, Ensemble Learning, Neighborhood Function, Individual Error



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ricardo Ñanculef (1)
  • Carlos Valle (1)
  • Héctor Allende (1)
  • Claudio Moraga (2, 3)

  1. Departamento de Informática, Universidad Técnica Federico Santa María, Valparaíso, Chile
  2. European Centre for Soft Computing, Mieres, Asturias, Spain
  3. Dortmund University, Dortmund, Germany
