
Compressed Network Complexity Search

  • Faustino Gomez
  • Jan Koutník
  • Jürgen Schmidhuber
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7491)

Abstract

Indirect encoding schemes for neural network phenotypes can represent large networks compactly. In previous work, we presented a new approach in which networks are encoded indirectly as a set of Fourier-type coefficients that decorrelate weight matrices, so that they can often be represented by a small number of genes, effectively reducing the search-space dimensionality and speeding up search. Up to now, the complexity of networks using this encoding was fixed a priori, both in terms of (1) the number of free parameters (topology) and (2) the number of coefficients. In this paper, we introduce a method, called Compressed Network Complexity Search (CNCS), for automatically determining network complexity that favors parsimonious solutions. CNCS maintains a probability distribution over complexity classes that it uses to select which class to optimize. Class probabilities are adapted based on their expected fitness. Starting with a prior biased toward the simplest networks, the distribution grows gradually until a solution is found. Experiments on two benchmark control problems, including a challenging non-linear version of the helicopter hovering task, demonstrate that the method consistently finds simple solutions.
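The loop outlined in the abstract (sample a complexity class, search within it, update the class distribution from expected fitness) can be illustrated with a small toy example. The Python sketch below is illustrative only: it assumes a toy quadratic fitness, a plain cosine basis as the coefficient-to-weight decoding, random sampling as the within-class search, and a softmax-with-floor update of the class probabilities. None of these specifics are taken from the paper, which evaluates decoded networks on control tasks and searches in coefficient space with an evolutionary method.

```python
# Minimal CNCS-style sketch (illustrative, not the authors' exact algorithm).
import numpy as np

rng = np.random.default_rng(0)


def decode_weights(coeffs, n_weights):
    """Expand a short coefficient vector into a full weight vector using a
    cosine basis (a stand-in for the paper's Fourier-type decoding)."""
    k = np.arange(n_weights)
    weights = np.zeros(n_weights)
    for i, c in enumerate(coeffs):
        weights += c * np.cos(np.pi * (k + 0.5) * i / n_weights)
    return weights


def fitness(weights):
    """Toy fitness (higher is better): closeness to a fixed target vector.
    A real run would evaluate the decoded network on the control task."""
    target = np.linspace(-1.0, 1.0, weights.size)
    return -float(np.sum((weights - target) ** 2))


n_weights = 32                                   # fixed phenotype size for this toy
classes = np.arange(1, 9)                        # complexity class = number of coefficients
probs = np.exp(-0.5 * np.arange(len(classes)))   # prior biased toward the simplest classes
probs /= probs.sum()
expected_fit = np.full(len(classes), -np.inf)    # running fitness estimate per class

for generation in range(200):
    # 1. Pick a complexity class according to the current distribution.
    ci = rng.choice(len(classes), p=probs)
    n_coeffs = classes[ci]

    # 2. One crude search step inside that class (random sampling here; the
    #    paper runs an evolutionary search in coefficient space instead).
    best = max(
        fitness(decode_weights(rng.normal(size=n_coeffs), n_weights))
        for _ in range(20)
    )
    expected_fit[ci] = (best if np.isinf(expected_fit[ci])
                        else 0.9 * expected_fit[ci] + 0.1 * best)

    # 3. Re-weight classes by a softmax over expected fitness, mixing in a
    #    small uniform floor so more complex, unexplored classes stay reachable.
    seen = np.isfinite(expected_fit)
    scores = np.where(seen, expected_fit, expected_fit[seen].min())
    probs = np.exp(scores - scores.max())
    probs = 0.95 * probs / probs.sum() + 0.05 / len(classes)
    probs /= probs.sum()

print("final class probabilities:", np.round(probs, 3))
```

Run as written, the distribution starts concentrated on one- or two-coefficient networks and only shifts mass toward larger classes if they yield higher expected fitness, which mirrors the parsimony bias described in the abstract.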

Keywords

Compression Ratio · Bias Weight · Search Space Dimensionality · Search Distribution · Recurrent Weight

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Faustino Gomez¹
  • Jan Koutník¹
  • Jürgen Schmidhuber¹

  1. IDSIA, USI-SUPSI, Manno-Lugano, Switzerland
