Evolutionary Learning Optimum-Seeking on Parallel Computer Architectures

  • Hans-Paul Schwefel
Part of the Advances in Simulation book series (ADVS.SIMULATION, volume 1)


On the one hand, many people admire the often strikingly efficient results of organic evolution. On the other hand, mutation and selection are commonly presupposed to be a rather prodigal and inefficient trial-and-error strategy. Taking into account the parallel processing of a heterogeneous population and sexual propagation with recombination, as well as the endogenous adaptation of strategy parameters, simulated evolution reveals a number of interesting, sometimes surprising, properties of nature’s learning-by-doing algorithm. ‘Survival of the fittest’, often taken as Darwin’s view, turns out to be bad advice. Individual death, forgetting, and even regression turn out to be necessary ingredients of the life game. Whether the process should be called gradualistic or punctualistic is a matter of the observer’s point of view. One may even observe ‘long cycles’.
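The ingredients named above — a heterogeneous population processed in parallel, recombination, endogenous adaptation of strategy parameters, and obligatory individual death — can be sketched as a minimal (μ, λ) evolution strategy. This is an illustrative sketch only, not the author's implementation; the function names, parameter values, and the sphere-model objective are assumptions chosen for the example.

```python
import math
import random

def evolution_strategy(f, dim=5, mu=5, lam=30, generations=200, seed=0):
    """Illustrative (mu, lambda) evolution strategy with self-adaptive step sizes.

    Non-elitist 'comma' selection: all parents are discarded each generation
    ('individual death' and 'forgetting'), so temporary regression can occur.
    """
    rng = random.Random(seed)
    tau = 1.0 / math.sqrt(2.0 * dim)  # assumed learning rate for step-size mutation
    # Each individual carries object variables AND a strategy parameter (sigma).
    pop = [([rng.uniform(-5, 5) for _ in range(dim)], 1.0) for _ in range(mu)]
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            (x1, s1), (x2, s2) = rng.sample(pop, 2)  # sexual propagation: two parents
            # Intermediate recombination of variables and of the step size.
            x = [(a + b) / 2.0 for a, b in zip(x1, x2)]
            s = math.sqrt(s1 * s2)
            # Endogenous adaptation: mutate sigma first, then use it on the variables.
            s *= math.exp(tau * rng.gauss(0.0, 1.0))
            x = [xi + s * rng.gauss(0.0, 1.0) for xi in x]
            offspring.append((x, s))
        # Comma selection: the mu best offspring replace the entire parent population.
        offspring.sort(key=lambda ind: f(ind[0]))
        pop = offspring[:mu]
    return pop[0]

# Example objective: minimise the sphere model f(x) = sum of x_i squared.
best_x, best_sigma = evolution_strategy(lambda x: sum(v * v for v in x))
```

Because the step size evolves alongside the object variables, individuals that happen to carry a well-scaled sigma produce better offspring and are selected indirectly — the "learning of how to learn" the abstract calls endogenous adaptation.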






Copyright information

© Akademie-Verlag Berlin 1988

Authors and Affiliations

  • Hans-Paul Schwefel
  1. Department of Computer Science, University of Dortmund, Dortmund 50, Fed. Rep. Germany
