Performance of Evolutionary Algorithms on Random Decomposable Problems

  • Martin Pelikan
  • Kumara Sastry
  • Martin V. Butz
  • David E. Goldberg
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4193)


This paper describes a class of random additively decomposable problems (rADPs) with and without interactions between the subproblems. The paper then tests the hierarchical Bayesian optimization algorithm (hBOA) and other evolutionary algorithms on a large number of random instances of the proposed class of problems. The results show that hBOA can scalably solve rADPs and that it significantly outperforms all other methods included in the comparison. Furthermore, the results provide a number of interesting insights into both the difficulty of a broad class of decomposable problems and the sensitivity of various evolutionary algorithms to different sources of problem difficulty. rADPs can also be used to test other optimization algorithms.
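To make the notion of an additively decomposable problem concrete, the sketch below builds a toy fitness function whose value is the sum of independent random subfunctions over disjoint blocks of bits. This is an illustrative assumption-laden construction, not the paper's exact rADP definition (which also covers interactions between subproblems); the function and parameter names are hypothetical.

```python
import random

def make_radp(n_bits=30, k=3, seed=0):
    """Toy additively decomposable fitness function: the bit string is
    split into n_bits // k disjoint blocks of k bits, and each block is
    scored by its own random lookup table over all 2**k block settings.
    Illustrative only; the paper's rADP construction differs in detail."""
    rng = random.Random(seed)
    n_blocks = n_bits // k
    # One random subfunction (a lookup table of 2**k values) per block.
    tables = [[rng.random() for _ in range(2 ** k)] for _ in range(n_blocks)]

    def fitness(bits):
        assert len(bits) == n_bits
        total = 0.0
        for b in range(n_blocks):
            block = bits[b * k:(b + 1) * k]
            idx = int("".join(map(str, block)), 2)  # block bits -> table index
            total += tables[b][idx]
        return total

    return fitness
```

Because the overall fitness decomposes into independent block scores, an algorithm that discovers the block structure (as hBOA aims to) can optimize each subproblem nearly independently, whereas a method blind to the structure must search the full bit string.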


Genetic Algorithm, Evolutionary Algorithm, Problem Instance, Hill Climbing, Random Instance





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Martin Pelikan (1)
  • Kumara Sastry (2)
  • Martin V. Butz (3)
  • David E. Goldberg (2)
  1. Missouri Estimation of Distribution Algorithms Laboratory (MEDAL), 320 CCB, University of Missouri in St. Louis, St. Louis
  2. Illinois Genetic Algorithms Laboratory (IlliGAL), 107 TB, University of Illinois at Urbana-Champaign, Urbana
  3. Department of Cognitive Psychology, University of Würzburg, Würzburg, Germany
