Adaptive Evolutionary Algorithm Based on a Cliqued Gibbs Sampling over Graphical Markov Model Structure

  • Eunice Esther Ponce-de-Leon-Senti
  • Elva Diaz-Diaz
Part of the Adaptation, Learning, and Optimization book series (ALO, volume 14)

Abstract

This chapter introduces an Estimation of Distribution Algorithm (EDA) based on a two-step learning strategy. In the first step, the complexity of the search sample is estimated through an index based on the sample entropy. The learning algorithm first fits a tree and then uses the sample complexity index to predict the missing edges, adding more edges if necessary, to obtain the cliques of the structure of the estimated distribution. In the second step, a new population is generated by a new cliqued Gibbs sampler (CG-Sampler) that moves through the space of solutions driven by the cliques of the learned graphical Markov model. Two variants of this algorithm, the Adaptive Tree Cliqued EDA (ATC-EDA) and the Adaptive Extended Tree Cliqued EDA (AETC-EDA), are compared; the Boltzmann selection procedure is used in the CG-Sampler. They are tested on 5 known functions defined for 48, 50, 99 and 100 variables, and compared to the Univariate Marginal Distribution Algorithm (UMDA). The performance of the two algorithms equals that of UMDA on the OneMax and ZeroMax functions, and both ATC-EDA and AETC-EDA outperform UMDA on the other 3 functions.
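To make the ingredients named in the abstract concrete, the following is a minimal, generic sketch of the UMDA baseline combined with Boltzmann selection, applied to the OneMax test function. This is not the chapter's ATC-EDA or AETC-EDA (which learn a tree/clique structure and sample with the CG-Sampler); it only illustrates the standard components the chapter builds on, with all parameter values (population size, `beta`, generation count) chosen arbitrarily for illustration.

```python
import math
import random

def one_max(x):
    # OneMax: fitness is the number of ones in the bit string.
    return sum(x)

def boltzmann_select(pop, fitness, beta, rng):
    # Boltzmann selection: sample individuals (with replacement) with
    # probability proportional to exp(beta * fitness).
    weights = [math.exp(beta * fitness(ind)) for ind in pop]
    return rng.choices(pop, weights=weights, k=len(pop))

def umda(fitness, n_vars=48, pop_size=100, generations=60, beta=0.5, seed=0):
    # UMDA: estimate an independent (univariate) marginal per variable
    # from the selected sample, then resample a new population from it.
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        selected = boltzmann_select(pop, fitness, beta, rng)
        # Marginal probability of a 1 at each position.
        p = [sum(ind[i] for ind in selected) / len(selected)
             for i in range(n_vars)]
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_vars)]
               for _ in range(pop_size)]
    return max(pop, key=fitness)
```

Because UMDA factorizes the distribution into independent marginals, it captures no variable interactions; the chapter's tree- and clique-based models generalize exactly this step by learning a graphical Markov model and sampling from its cliques.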

Keywords

Markov Random Fields · Gibbs Distribution · Distribution Algorithm · Clique Size · Bayesian Optimization Algorithm



Copyright information

© Springer Berlin Heidelberg 2012

Authors and Affiliations

  • Eunice Esther Ponce-de-Leon-Senti (1)
  • Elva Diaz-Diaz (1)
  1. Computer Sciences Department, Autonomous University of Aguascalientes, Aguascalientes, Mexico
