Improving the Combination Module with a Neural Network

  • Carlos Hernández-Espinosa
  • Joaquín Torres-Sospedra
  • Mercedes Fernández-Redondo
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4113)

Abstract

In this paper we propose two versions of Stacked Generalization as the combination module of an ensemble of neural networks. The first version uses only the information provided by the expert networks. The second also uses the input data of the pattern being classified. Finally, we compare six classical combination methods with the two versions of Stacked Generalization to determine the best-performing method. The results show that the methods based on Stacked Generalization outperform the classical combination methods.
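The two variants differ only in what the combiner network sees: version 1 is trained on the concatenated outputs of the expert networks, while version 2 is trained on those outputs plus the raw input pattern. A minimal sketch of the idea, using scikit-learn's `MLPClassifier` as a stand-in for the paper's multilayer feedforward networks and a synthetic dataset in place of the paper's benchmarks (the paper's exact training partitions and network sizes are not specified here, so this is illustrative only):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Toy classification data standing in for a real benchmark.
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Level-0: an ensemble of "expert" networks, differing only in init seed.
experts = [
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000, random_state=s)
    .fit(X_tr, y_tr)
    for s in range(3)
]

def expert_outputs(X):
    # Concatenate each expert's class-probability outputs side by side.
    return np.hstack([e.predict_proba(X) for e in experts])

# Version 1: the combiner network sees only the experts' outputs.
comb_v1 = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000,
                        random_state=0).fit(expert_outputs(X_tr), y_tr)

# Version 2: the combiner also sees the input pattern itself.
Z_tr = np.hstack([expert_outputs(X_tr), X_tr])
comb_v2 = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000,
                        random_state=0).fit(Z_tr, y_tr)

acc_v1 = comb_v1.score(expert_outputs(X_te), y_te)
acc_v2 = comb_v2.score(np.hstack([expert_outputs(X_te), X_te]), y_te)
```

Note that Wolpert's original Stacked Generalization trains the level-1 combiner on held-out (cross-validated) expert outputs to avoid overfitting; the sketch above skips that partitioning for brevity.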

Keywords

Neural Network, Combination Method, Hidden Unit, Error Reduction, Combination Module

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Carlos Hernández-Espinosa 1
  • Joaquín Torres-Sospedra 1
  • Mercedes Fernández-Redondo 1

  1. Departamento de Ingeniería y Ciencia de los Computadores, Universitat Jaume I, Castellón, Spain
