Accelerating the evolutionary-gradient-search procedure: Individual step sizes

  • Ralf Salomon
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1498)


Recent research has proposed the evolutionary-gradient-search procedure, which uses an evolutionary scheme to estimate a gradient direction and then performs parameter updates in steepest-descent fashion. On several test functions, the procedure has shown faster convergence than other evolutionary algorithms. However, it also exhibits deficiencies similar to those of steepest-descent methods. This paper explores the extent to which individual step sizes, as known from evolution strategies, can be adopted beneficially. It turns out that they accelerate convergence considerably.
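The abstract describes a two-part loop: estimate a gradient direction from random mutations (the evolutionary part), then take a step against that direction in steepest-descent fashion. The sketch below illustrates this idea on the sphere function; the trial count, the step-adaptation factor `zeta`, and the accept-only-if-better rule are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def egs_step(f, x, sigma, rng, n_trials=20, zeta=1.8):
    """One evolutionary-gradient-search iteration (illustrative sketch)."""
    # Evolutionary part: sample Gaussian mutation vectors around x and
    # weight each by the fitness change it causes; the weighted sum
    # points, on average, along the local gradient.
    z = rng.normal(scale=sigma, size=(n_trials, x.size))
    fx = f(x)
    g = ((np.array([f(x + zi) for zi in z]) - fx)[:, None] * z).sum(axis=0)
    e = g / np.linalg.norm(g)
    # Steepest-descent part: try a longer and a shorter step against the
    # estimated gradient, keep the better one if it improves on f(x),
    # and adapt sigma toward the winning step length.
    candidates = [(x - s * e, s) for s in (sigma * zeta, sigma / zeta)]
    best, s_best = min(candidates, key=lambda c: f(c[0]))
    if f(best) < fx:
        return best, s_best
    return x, sigma / zeta  # no progress: shrink the step size

# Minimize the 5-dimensional sphere function from a fixed start.
rng = np.random.default_rng(0)
sphere = lambda v: float(np.dot(v, v))
x, sigma = np.ones(5), 1.0
for _ in range(100):
    x, sigma = egs_step(sphere, x, sigma, rng)
```

The paper's contribution, individual step sizes, would replace the scalar `sigma` with a vector of per-coordinate step sizes (each mutation component sampled with its own standard deviation), so that the procedure can follow narrow, axis-scaled valleys that slow a single-step-size steepest descent.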


Keywords: Evolutionary Algorithm · Evolution Strategy · Correlated Mutation · Acceleration Method · Mutation Vector





Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • Ralf Salomon
    AI Lab, Department of Computer Science, University of Zurich, Zurich, Switzerland
