
First Experiments on Ensembles of Radial Basis Functions

  • Carlos Hernández-Espinosa
  • Mercedes Fernández-Redondo
  • Joaquín Torres-Sospedra
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3077)

Abstract

Building an ensemble of classifiers is a useful way to improve performance with respect to a single classifier. In the case of neural networks, the literature has centered on the use of Multilayer Feedforward networks. However, other interesting networks, such as Radial Basis Functions (RBF), can also be used as elements of the ensemble. Furthermore, as recently pointed out, RBF networks can also be trained by gradient descent, so all the ensemble construction methods designed for Multilayer Feedforward are also applicable to RBF. In this paper we present the results of using eleven methods to construct an ensemble of RBF networks. We have trained ensembles with a reduced number of networks (3 and 9) to keep the computational cost low. The results show that the best method is, in general, the Simple Ensemble.
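
The paper itself does not include code, but the core idea of a "simple ensemble" of gradient-descent-trained RBF networks can be sketched briefly. The following is a minimal, hypothetical NumPy illustration: the names RBFNetwork and simple_ensemble_predict, the hyperparameters, and the toy data are assumptions, not taken from the paper. Each network is trained independently from a different random initialisation, and the ensemble prediction averages the networks' outputs. For simplicity only the output weights are adapted here, whereas the reformulated RBF networks referred to in the abstract also adapt centers and widths.

```python
# Minimal sketch (not the authors' implementation) of a "simple ensemble" of
# RBF networks: each net is trained independently by gradient descent and the
# ensemble averages their outputs. All names and settings are hypothetical.
import numpy as np


class RBFNetwork:
    """Gaussian RBF network with a linear output layer."""

    def __init__(self, n_centers, n_classes, lr=0.05, epochs=200, rng=None):
        self.n_centers = n_centers
        self.n_classes = n_classes
        self.lr = lr
        self.epochs = epochs
        self.rng = rng if rng is not None else np.random.default_rng()

    def _phi(self, X):
        # Gaussian activations: phi_j(x) = exp(-||x - c_j||^2 / (2 * sigma_j^2))
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * self.sigmas ** 2))

    def fit(self, X, y_onehot):
        n = X.shape[0]
        # Centers initialised from randomly chosen training patterns.
        idx = self.rng.choice(n, self.n_centers, replace=False)
        self.centers = X[idx].copy()
        self.sigmas = np.full(self.n_centers, X.std() + 1e-8)
        self.W = self.rng.normal(scale=0.1, size=(self.n_centers, self.n_classes))
        for _ in range(self.epochs):
            phi = self._phi(X)          # (n, n_centers)
            out = phi @ self.W          # linear network outputs
            err = out - y_onehot        # gradient of the squared error
            # Gradient descent on the output weights only (a simplification).
            self.W -= self.lr * phi.T @ err / n
        return self

    def predict_proba(self, X):
        return self._phi(X) @ self.W


def simple_ensemble_predict(networks, X):
    """'Simple ensemble': average the outputs of independently trained nets."""
    avg = np.mean([net.predict_proba(X) for net in networks], axis=0)
    return avg.argmax(axis=1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy two-class data, used only to exercise the sketch.
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    Y = np.eye(2)[y]
    # Ensemble of 3 networks, matching the smaller ensemble size in the paper.
    nets = [RBFNetwork(8, 2, rng=np.random.default_rng(s)).fit(X, Y) for s in range(3)]
    pred = simple_ensemble_predict(nets, X)
    print("training accuracy:", (pred == y).mean())
```

The combination step above uses output averaging; other ensemble construction methods evaluated in the paper (e.g., boosting or decorrelation-based approaches) differ mainly in how the individual networks are trained, not in this final combination.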

Keywords

Radial Basis Function · Radial Basis Function Neural Network · Ensemble Method · Error Reduction · Training Pattern



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Carlos Hernández-Espinosa (1)
  • Mercedes Fernández-Redondo (1)
  • Joaquín Torres-Sospedra (1)

  1. Dept. de Ingeniería y Ciencia de los Computadores, Universidad Jaume I, Castellón, Spain
