On risky methods for local selection under noise

  • Günter Rudolph
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1498)


The choice of the selection method used in an evolutionary algorithm may have a considerable impact on the behavior of the entire algorithm. Earlier work was therefore devoted to characterizing selection methods by certain distinguishing measures that may guide the design of an evolutionary algorithm for a specific task. Here, a complementary characterization of selection methods is proposed that is useful in the presence of noise. This characterization is derived from interpreting iterated selection procedures as sequential non-parametric statistical tests. From this point of view, a selection method is risky if there exists a parameterization of the noise distributions such that the population is more often directed in the wrong direction than in the correct one, i.e., if the error probability is larger than 1/2. It is shown that this characterization actually partitions the set of selection methods into two non-empty sets, by exhibiting an element of each set.
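The notion of a risky selection step can be illustrated with a small Monte Carlo sketch. This is a hypothetical construction, not one of the selection rules analyzed in the paper: under suitably skewed noise, a selection step that compares single noisy fitness samples prefers the candidate with the *lower* mean fitness in more than half of all steps, i.e., its error probability exceeds 1/2.

```python
import random

# Hypothetical example (maximization): candidate A has the higher MEAN
# fitness, but skewed noise makes it lose a single-sample comparison
# more often than not.

def observe_a(rng):
    # True mean 4: observed as 10 with probability 0.4, else 0.
    return 10.0 if rng.random() < 0.4 else 0.0

def observe_b(rng):
    # Noise-free observation; true mean 3 < 4.
    return 3.0

def single_sample_selection(rng):
    """Select the candidate whose single noisy observation is better."""
    return 'A' if observe_a(rng) > observe_b(rng) else 'B'

rng = random.Random(1)
trials = 100_000
errors = sum(single_sample_selection(rng) == 'B' for _ in range(trials))
print(errors / trials)  # close to 0.6, i.e., error probability > 1/2
```

A selection rule that instead averaged many independent samples per candidate would, by the law of large numbers, drive the error probability below 1/2 for this noise model; the paper's point is that some selection methods admit *no* such rescue under some noise parameterization, while others do.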


Keywords: Evolutionary Algorithm · Selection Method · Markov Chain Model · Absorption Probability · Random Walk Model





Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • Günter Rudolph
    Fachbereich Informatik, Universität Dortmund, Dortmund, Germany
