Evolutionary Network Minimization: Adaptive Implicit Pruning of Successful Agents

  • Zohar Ganon
  • Alon Keinan
  • Eytan Ruppin
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2801)


Neurocontroller minimization is beneficial for constructing small, parsimonious networks that permit a better understanding of their workings. This paper presents a novel Evolutionary Network Minimization (ENM) algorithm, which is applied to fully recurrent neurocontrollers. ENM is a simple, standard genetic algorithm with an additional step in which small weights are irreversibly eliminated. ENM has a unique combination of features that distinguishes it from previous evolutionary minimization algorithms: 1. No explicit penalty term is added to the fitness function. 2. Minimization begins only after functional neurocontrollers have been successfully evolved. 3. Successful minimization relies solely on a drift that removes unimportant weights and, importantly, on continuing adaptive modification of the magnitudes of the remaining weights. Our results show that ENM extensively minimizes recurrent evolved neurocontrollers while keeping their fitness intact and maintaining their principal functional characteristics.
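The abstract describes ENM as a standard genetic algorithm augmented with one extra step: weights whose magnitude falls below a threshold are irreversibly eliminated, while the surviving weights continue to adapt. A minimal sketch of one such generation is given below; the truncation selection, Gaussian mutation, and the `threshold` and `mutation_sigma` parameters are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def enm_step(population, fitness_fn, threshold=0.05, mutation_sigma=0.1):
    """One generation of an ENM-style GA (illustrative sketch).

    `population` is a list of (weights, pruned_mask) pairs. A weight whose
    mask entry is True has been eliminated and stays zero forever, so
    pruning is irreversible; the remaining weights keep mutating, which
    lets their magnitudes adapt to compensate for the removed connections.
    """
    # Standard GA part: rank by fitness and keep the top half (truncation selection).
    scored = sorted(population, key=lambda ind: fitness_fn(ind[0]), reverse=True)
    survivors = scored[: len(scored) // 2]

    next_gen = []
    for weights, mask in survivors:
        for _ in range(2):  # each survivor produces two offspring
            child = weights + rng.normal(0.0, mutation_sigma, size=weights.shape)
            child[mask] = 0.0  # previously pruned weights can never return
            # The additional ENM step: irreversibly eliminate small weights.
            new_mask = mask | (np.abs(child) < threshold)
            child[new_mask] = 0.0
            next_gen.append((child, new_mask))
    return next_gen
```

Note that no penalty term is added to `fitness_fn`; network size shrinks only because small weights drift below the threshold and are removed, matching features 1 and 3 of the abstract.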


Keywords: Successful Agent · Neural Information Processing System · Move · Motor Command Neuron · Pruning Algorithm
(These keywords were machine-generated, not supplied by the authors.)





Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Zohar Ganon¹
  • Alon Keinan¹
  • Eytan Ruppin¹,²

  1. School of Computer Sciences, Tel-Aviv University, Tel-Aviv, Israel
  2. School of Medicine, Tel-Aviv University, Tel-Aviv, Israel
