
Why Is Optimization Difficult?

  • Thomas Weise
  • Michael Zapf
  • Raymond Chiong
  • Antonio J. Nebro
Part of the Studies in Computational Intelligence book series (SCI, volume 193)

Abstract

This chapter addresses some of the fundamental issues frequently encountered in optimization problems that make them difficult to solve, including premature convergence, ruggedness, causality, deceptiveness, neutrality, epistasis, robustness, overfitting, oversimplification, multi-objectivity, dynamic fitness, and the No Free Lunch Theorem. We explain why these issues make optimization problems hard to solve and present possible countermeasures for dealing with them. In doing so, we hope to help both practitioners and fellow researchers create more efficient optimization applications and novel algorithms.
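As an illustration of one of the issues named above (not code from the chapter itself), the following minimal Python sketch shows how deceptiveness can mislead a local search: on a Deb/Goldberg-style fully deceptive trap function, the locally visible fitness gradient points away from the global optimum, so a single-bit-flip hill climber almost always converges to the deceptive all-zeros solution. The `trap` and `hill_climb` functions are hypothetical illustrations, assuming a bit-string search space.

```python
import random


def trap(bits):
    """Fully deceptive trap function on a bit string (to be maximized).

    The global optimum is the all-ones string with fitness n, but every
    string with u < n ones has fitness n - 1 - u, so the gradient of the
    function leads the search toward the all-zeros string instead.
    """
    n, u = len(bits), sum(bits)
    return n if u == n else n - 1 - u


def hill_climb(n=20, steps=2000, seed=0):
    """Single-bit-flip hill climber; illustrative only, not the chapter's method."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = trap(x)
    for _ in range(steps):
        y = x[:]
        y[rng.randrange(n)] ^= 1   # flip one randomly chosen bit
        fy = trap(y)
        if fy >= fx:               # accept non-worsening moves
            x, fx = y, fy
    return x, fx


if __name__ == "__main__":
    best, fitness = hill_climb()
    # Almost always ends at the deceptive optimum (all zeros, fitness n - 1)
    # rather than the global optimum (all ones, fitness n).
    print(best, fitness)
```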

Keywords

Genetic Algorithm · Particle Swarm Optimization · Evolutionary Algorithm · Pareto Front · Evolutionary Computation

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Thomas Weise (1)
  • Michael Zapf (1)
  • Raymond Chiong (2)
  • Antonio J. Nebro (3)

  1. Distributed Systems Group, University of Kassel, Kassel, Germany
  2. School of Computing & Design, Swinburne University of Technology (Sarawak Campus), Kuching, Sarawak, Malaysia
  3. Dept. Lenguajes y Ciencias de la Computación, ETSI Informática, University of Málaga, Málaga, Spain
