Simultaneous Optimization of Weights and Structure of an RBF Neural Network

  • Virginie Lefort
  • Carole Knibbe
  • Guillaume Beslon
  • Joël Favrel
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3871)

Abstract

We propose a new evolutionary algorithm, the RBF-Gene algorithm, to optimize Radial Basis Function (RBF) neural networks. Unlike previous work on this subject, our algorithm evolves both the structure and the numerical parameters of the network: it optimizes the number of neurons as well as their weights.
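As an illustration of the kind of representation such an algorithm must handle (a minimal sketch, not the authors' actual RBF-Gene encoding), an RBF network with a variable number of Gaussian neurons can be stored as a variable-length list, so that a single genome-like object carries both the structure and the numerical parameters:

```python
import math

def rbf_output(x, neurons, bias=0.0):
    """Evaluate an RBF network: a weighted sum of Gaussian kernels.

    `neurons` is a variable-length list of (center, width, weight)
    triples, so the number of neurons (structure) and their parameters
    (weights) can both change under evolutionary operators.
    """
    total = bias
    for center, width, weight in neurons:
        dist2 = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
        total += weight * math.exp(-dist2 / (2.0 * width ** 2))
    return total

# A two-neuron network over 2D inputs: (center, width, weight) triples.
net = [((0.0, 0.0), 1.0, 1.5), ((2.0, 2.0), 0.5, -0.8)]
print(rbf_output((0.0, 0.0), net))  # ≈ 1.5, dominated by the first kernel
```

Adding or deleting a triple changes the network's structure without touching the evaluation code, which is what makes simultaneous structural and parametric optimization possible in a single search space.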

The RBF-Gene algorithm’s behavior is first illustrated on a simple toy problem, the 2D sine wave. Results on a classical benchmark are then presented. They show that our algorithm fits the data closely while keeping the network structure simple, so the resulting solution generalizes well.
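A hedged sketch of a fitness function for such a 2D sine-wave toy problem (the exact target function, sampling grid, and error measure used in the paper are assumptions here):

```python
import math

def target(x, y):
    # Assumed 2D sine-wave target; the paper's exact function may differ.
    return math.sin(math.sqrt(x * x + y * y))

def fitness(predict, samples):
    """Mean squared error between a candidate model and the target.

    `predict` is any callable (x, y) -> float, e.g. an evolved RBF
    network; lower values are better for the evolutionary search.
    """
    err = 0.0
    for x, y in samples:
        err += (predict(x, y) - target(x, y)) ** 2
    return err / len(samples)

# Regular grid of sample points over [0, 2] x [0, 2].
grid = [(i * 0.5, j * 0.5) for i in range(5) for j in range(5)]
print(fitness(lambda x, y: 0.0, grid))  # baseline: predict zero everywhere
```

Evolution would then select genomes whose decoded networks minimize this error, with some pressure (implicit or explicit) toward fewer neurons to keep the structure simple.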

Keywords

Genetic Algorithm, Genetic Code, Hidden Neuron, Radial Basis Function Neural Network, Good Individual



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Virginie Lefort (1)
  • Carole Knibbe (1)
  • Guillaume Beslon (1)
  • Joël Favrel (1)

  1. INSA-IF/PRISMa, Villeurbanne, France
