Abstract
The UMDA (univariate marginal distribution algorithm) was derived by analyzing the mathematical principles behind recombination; mutation, however, was not considered. The same is true for the FDA (factorized distribution algorithm), an extension of the UMDA that can capture dependencies between variables. In this paper, mutation is introduced into these algorithms by a technique called the Bayesian prior. We theoretically derive an estimate of how to choose the Bayesian prior, and the recommended value turns out to be a good choice in a number of experiments. These experiments also indicate that mutation in many cases increases the performance of the algorithms and decreases the dependence on a good choice of the population size.
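The core idea can be sketched as follows. This is a minimal illustration, not the authors' implementation: in UMDA each bit's marginal probability is re-estimated from the selected individuals, and adding a Laplace-style prior count `r` (a hypothetical parameter name here) to both outcomes keeps every marginal away from 0 and 1, which acts like a mutation operator.

```python
import random

def umda_with_prior(fitness, n_bits, pop_size=100, sel_size=50, r=1.0, gens=60):
    """UMDA with a Bayesian (Laplace-style) prior on the marginal frequencies.

    Each marginal is estimated as p_i = (ones_i + r) / (sel_size + 2*r),
    so p_i never reaches 0 or 1; the prior r thus plays the role of mutation.
    """
    p = [0.5] * n_bits          # start from the uniform distribution
    best = None
    for _ in range(gens):
        # Sample a population from the product of the current marginals
        pop = [[1 if random.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)
        if best is None or fitness(pop[0]) > fitness(best):
            best = pop[0]
        selected = pop[:sel_size]   # truncation selection
        # Re-estimate the marginals with the Bayesian prior
        for i in range(n_bits):
            ones = sum(x[i] for x in selected)
            p[i] = (ones + r) / (sel_size + 2 * r)
    return best

random.seed(0)
best = umda_with_prior(sum, 20)   # OneMax: fitness = number of ones
```

With `r = 0` the algorithm reduces to plain UMDA, where a marginal that hits 0 or 1 can never recover; the paper's contribution is a theoretical estimate of how large `r` should be.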
© 2001 Springer-Verlag Berlin Heidelberg
Mahnig, T., Mühlenbein, H. (2001). Optimal Mutation Rate Using Bayesian Priors for Estimation of Distribution Algorithms. In: Steinhöfel, K. (eds) Stochastic Algorithms: Foundations and Applications. SAGA 2001. Lecture Notes in Computer Science, vol 2264. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45322-9_2
Print ISBN: 978-3-540-43025-4
Online ISBN: 978-3-540-45322-2