
Theory of Evolutionary Algorithms and Genetic Programming

  • Stefan Droste
  • Thomas Jansen
  • Günter Rudolph
  • Hans-Paul Schwefel
  • Karsten Tinnefeld
  • Ingo Wegener
Part of the Natural Computing Series (NCS)

Summary

Randomized search heuristics are an alternative to specialized, problem-specific algorithms. They are applied to NP-hard problems in the hope of being efficient in typical cases, they are an option when no problem-specific algorithm is available, and they are the only choice in black-box optimization, where the function to be optimized is not known to the algorithm. Evolutionary algorithms (EAs) are a special class of randomized search heuristics with many successful applications. However, the theory of evolutionary algorithms is still in its infancy. Here, many new contributions toward such a theory are presented and discussed.
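To make the black-box setting concrete, the following is a minimal sketch (not taken from the chapter itself) of the (1+1) EA, the simplest evolutionary algorithm and a recurring object of theoretical analysis in this line of work. It treats the fitness function as a black box and is shown here maximizing the toy function OneMax, the number of ones in a bit string; the function and parameter names are illustrative choices.

```python
import random

def one_max(x):
    """Toy black-box fitness: the number of ones in the bit string."""
    return sum(x)

def one_plus_one_ea(fitness, n, steps=10000, seed=0):
    """(1+1) EA: maintain a single search point, flip each bit
    independently with probability 1/n, and accept the offspring
    if its fitness is at least as good as the parent's."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        y = [1 - b if rng.random() < 1 / n else b for b in x]
        if fitness(y) >= fitness(x):
            x = y
    return x

best = one_plus_one_ea(one_max, n=20)
print(one_max(best))
```

The algorithm queries the fitness function only as a black box, which is exactly the scenario in which such heuristics are the only choice; for OneMax its expected optimization time is known to be of order n log n, so 10000 steps suffice for n = 20 with overwhelming probability.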

Keywords

Evolutionary algorithm, Boolean function, multiobjective optimization, mutation probability, search point


Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Stefan Droste¹
  • Thomas Jansen¹
  • Günter Rudolph²
  • Hans-Paul Schwefel²
  • Karsten Tinnefeld¹
  • Ingo Wegener¹
  1. Department of Computer Science, Informatik II, University of Dortmund, Dortmund, Germany
  2. Department of Computer Science, Informatik XI, University of Dortmund, Dortmund, Germany