Journal of Computer Science and Technology, Volume 27, Issue 5, pp. 907–936

Evolutionary Optimization: Pitfalls and Booby Traps

Regular Paper

Abstract

Evolutionary computation (EC), a collective name for a range of metaheuristic black-box optimization algorithms, is one of the fastest-growing areas in computer science. Many manuals and "how-to" guides on the use of different EC methods, as well as a variety of free and commercial software libraries, are widely available nowadays. However, when one of these methods is applied to a real-world task, many pitfalls and booby traps may be lurking: certain aspects of the optimization problem that can lead to unsatisfactory results even if the algorithm appears to be correctly implemented and executed. These include convergence issues; ruggedness, deceptiveness, and neutrality in the fitness landscape; epistasis and non-separability; noise and the resulting need for robustness; and dimensionality and scalability issues, among others. In this article, we systematically discuss these related hindrances and present possible remedies. The goal is to equip practitioners and researchers alike with a clear picture and understanding of what kinds of problems can render EC applications unsuccessful and how to avoid them from the start.
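
To make one of these pitfalls concrete, the following minimal Python sketch (illustrative only, not taken from the paper; the function and parameter names are hypothetical) shows a deceptive "trap" landscape on bit strings: everywhere except the all-ones optimum, fitness rewards having fewer ones, so a simple hill climber is systematically guided away from the global optimum.

    import random

    def trap(bits):
        # Deceptive trap: the all-ones string is the global optimum,
        # but every other string is rewarded for having FEWER ones.
        ones, n = sum(bits), len(bits)
        return n if ones == n else n - 1 - ones

    def hill_climb(n=20, steps=2000, seed=0):
        # (1+1) hill climber with single-bit-flip moves; it almost always
        # ends at the deceptive attractor (all zeros), not the optimum.
        rng = random.Random(seed)
        x = [rng.randint(0, 1) for _ in range(n)]
        for _ in range(steps):
            y = list(x)
            y[rng.randrange(n)] ^= 1   # flip one random bit
            if trap(y) >= trap(x):     # accept if not worse
                x = y
        return x, trap(x)

    print(hill_climb())  # typically ([0, 0, ..., 0], n - 1) rather than the optimal fitness n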

Keywords

evolutionary computing, problem difficulty, optimization, meta-heuristics

Supplementary material

11390_2012_1274_MOESM1_ESM.docx (DOCX, 15.4 kB)

Copyright information

© Springer Science+Business Media New York & Science Press, China 2012

Authors and Affiliations

  1. Nature Inspired Computation and Applications Laboratory, School of Computer Science and Technology, University of Science and Technology of China, Hefei, China
  2. Faculty of Higher Education, Swinburne University of Technology, Victoria, Australia
