Improving the Combination Module with a Neural Network
In this paper we propose two versions of Stacked Generalization as the combination module of an ensemble of neural networks. The first version uses only the information provided by the expert networks. The second also uses the input data of the pattern being classified. Finally, we compare six classical combination methods with the two versions of Stacked Generalization in order to identify the best method. The results show that the methods based on Stacked Generalization outperform the classical combination methods.
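The two combiners described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the expert and combination networks are small scikit-learn MLPs, the dataset (iris) and all names (`expert_outputs`, `stack_v2`, `meta_v1`, `meta_v2`) are assumptions for the example, and for brevity the combination network is trained on the same training set rather than on held-out expert predictions, which a full stacking protocol would use.

```python
# Sketch of the two stacked-generalization combiners (illustrative only).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Ensemble of expert networks: three small MLPs with different seeds.
experts = [MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                         random_state=s).fit(X_tr, y_tr)
           for s in range(3)]

def expert_outputs(X):
    # Concatenate the class-probability outputs of every expert.
    return np.hstack([e.predict_proba(X) for e in experts])

# Version 1: the combination network sees only the experts' outputs.
meta_v1 = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                        random_state=0).fit(expert_outputs(X_tr), y_tr)

# Version 2: the combination network also sees the input pattern itself.
def stack_v2(X):
    return np.hstack([expert_outputs(X), X])

meta_v2 = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                        random_state=0).fit(stack_v2(X_tr), y_tr)

acc_v1 = meta_v1.score(expert_outputs(X_te), y_te)
acc_v2 = meta_v2.score(stack_v2(X_te), y_te)
```

The only difference between the two versions is the feature vector handed to the combination network: version 2 appends the raw input pattern to the stacked expert outputs.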
Keywords: Neural Network, Combination Method, Hidden Unit, Error Reduction, Combination Module