Adaptive Evolutionary Algorithm Based on a Cliqued Gibbs Sampling over Graphical Markov Model Structure
This chapter introduces Estimation of Distribution Algorithms (EDAs) based on a two-step learning strategy. In the first step, the sample complexity of the search is estimated through an index based on the sample entropy. The algorithm learns a tree and then uses this sample-complexity index to predict the missing edges, adding further edges where necessary to obtain the cliques of the structure of the estimated distribution. In the second step, a new population is generated by a new cliqued Gibbs sampler (CG-Sampler) that moves through the space of solutions driven by the cliques of the learned graphical Markov model. Two variants of this algorithm, the Adaptive Tree Cliqued EDA (ATC-EDA) and the Adaptive Extended Tree Cliqued EDA (AETC-EDA), are compared, and the Boltzmann selection procedure is used in the CG-Sampler. They are tested on five well-known functions defined for 48, 50, 99 and 100 variables and compared with the Univariate Marginal Distribution Algorithm (UMDA). Both algorithms match the UMDA on the OneMax and ZeroMax functions and outperform it on the remaining three functions.
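To make the baseline and selection scheme mentioned above concrete, here is a minimal sketch of the UMDA with Boltzmann selection applied to the OneMax function. This is an illustrative reconstruction, not the chapter's implementation: the function names, population sizes, selection pressure `beta`, and the marginal-probability clipping are assumptions chosen for a runnable example, and the cliqued Gibbs sampler itself is not reproduced here.

```python
import math
import random

def onemax(x):
    # OneMax: number of ones in the bit string; maximized by the all-ones string.
    return sum(x)

def boltzmann_select(pop, fitness, beta, k, rng):
    # Boltzmann selection: draw k individuals with probability
    # proportional to exp(beta * fitness).
    scores = [fitness(ind) for ind in pop]
    m = max(scores)  # subtract the max for numerical stability
    weights = [math.exp(beta * (s - m)) for s in scores]
    return rng.choices(pop, weights=weights, k=k)

def umda(fitness, n_vars, pop_size=100, n_sel=50, generations=60, seed=1):
    # Univariate Marginal Distribution Algorithm: each generation, select
    # promising individuals, estimate independent per-bit marginals, and
    # sample the next population from the product of those marginals.
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_vars)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        selected = boltzmann_select(pop, fitness, beta=1.0, k=n_sel, rng=rng)
        # Per-variable marginal probabilities, clipped away from 0 and 1
        # to avoid premature convergence (clipping bounds are an assumption).
        p = [min(0.95, max(0.05, sum(ind[i] for ind in selected) / n_sel))
             for i in range(n_vars)]
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_vars)]
               for _ in range(pop_size)]
        cand = max(pop, key=fitness)
        if fitness(cand) > fitness(best):
            best = cand
    return best
```

The EDAs of the chapter replace the product of univariate marginals with a distribution factorized over the cliques of the learned graphical Markov model, which lets them capture dependencies between variables that the UMDA ignores.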
Keywords: Markov Random Fields, Gibbs Distribution, Distribution Algorithm, Clique Size, Bayesian Optimization Algorithm