Alternative OVA Proposals for Cooperative Competitive RBFN Design in Classification Tasks

  • Francisco Charte Ojeda
  • Antonio Jesús Rivera Rivas
  • María Dolores Pérez-Godoy
  • María Jose del Jesus
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7902)


In the Machine Learning field, one way to address the multi-class classification problem is to transform the data set into several binary data sets using techniques such as One-Versus-All (OVA). A classifier is then trained for each binary data set, and their outputs are combined to obtain the final predicted class. Determining the strategy used to combine the outputs of the binary classifiers is an interesting research area.

In this paper, different OVA strategies are developed and tested using as base classifier a cooperative-competitive RBFN design algorithm, CO2RBFN. One advantage of the obtained models is that, for a given class, they produce as output a continuous value proportional to its level of confidence. Concretely, three OVA strategies have been tested: the classical one, one based on the difference among outputs, and another based on a voting scheme, which has obtained the best results.
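As an illustration of how such binary outputs can be combined (the paper's exact decision rules are not reproduced in this excerpt), the classical OVA rule and a simple voting-style scheme can be sketched as follows. The 0.5 vote threshold and the fallback to the most confident classifier are assumptions of this sketch, not details taken from the paper:

```python
def ova_classical(scores):
    # scores[k]: confidence of binary classifier k that the sample belongs
    # to class k. The classical OVA rule picks the most confident classifier.
    return max(range(len(scores)), key=lambda k: scores[k])

def ova_voting(scores, threshold=0.5):
    # Voting-style combination (illustrative): each binary classifier
    # "votes" for its own class when its confidence exceeds the threshold
    # (0.5 is an assumed value). With exactly one vote, that class wins.
    votes = [k for k, s in enumerate(scores) if s > threshold]
    if len(votes) == 1:
        return votes[0]
    # No votes, or several competing votes: fall back to the most
    # confident classifier, as in the classical rule.
    return ova_classical(scores)
```

For example, with per-class confidences `[0.6, 0.7, 0.1]` both classifiers 0 and 1 vote, so the voting scheme falls back to the classical rule and predicts class 1.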


Keywords: OVA · RBFNs · Multi-class classification





Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Francisco Charte Ojeda (1)
  • Antonio Jesús Rivera Rivas (2)
  • María Dolores Pérez-Godoy (2)
  • María Jose del Jesus (2)
  1. Dept. of Computer Science and Artificial Intelligence, University of Granada, Spain
  2. Dept. of Computer Science, University of Jaén, Spain