Abstract
This paper describes how fitness inheritance can be used to estimate the fitness of a proportion of newly sampled candidate solutions in the Bayesian optimization algorithm (BOA). The goal of estimating fitness for some candidate solutions is to reduce the number of fitness evaluations on problems where fitness evaluation is expensive. The Bayesian networks used in BOA to model promising solutions and generate new ones are extended to allow not only modeling and sampling of candidate solutions, but also estimation of their fitness. The results indicate that fitness inheritance is a promising concept in BOA, because the population sizes required for building appropriate models of promising solutions lead to good fitness estimates even when only a small proportion of candidate solutions is evaluated with the actual fitness function. This can reduce the number of actual fitness evaluations by a factor of 30 or more.
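To make the idea concrete, the following is a minimal, hypothetical sketch of fitness inheritance in a simple genetic algorithm, not the paper's Bayesian-network-based estimator: a fraction `p_inherit` of offspring receive the average of their parents' (possibly themselves estimated) fitness values, in the spirit of Smith et al.'s original parent-averaging scheme, while the rest are evaluated with the actual fitness function. All names and parameter values here are illustrative assumptions.

```python
import random


def onemax(bits):
    # Stand-in for an "expensive" fitness function: number of ones.
    return sum(bits)


def fitness_inheritance_ga(n_bits=20, pop_size=40, p_inherit=0.75,
                           generations=30, seed=1):
    """Toy GA where a fraction p_inherit of offspring inherit the
    average of their parents' fitness instead of being evaluated."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    # The initial generation is fully evaluated.
    fit = [onemax(ind) for ind in pop]
    evaluations = pop_size

    for _ in range(generations):
        new_pop, new_fit = [], []
        while len(new_pop) < pop_size:
            # Binary tournament selection for each parent.
            a, b = rng.randrange(pop_size), rng.randrange(pop_size)
            p1 = a if fit[a] >= fit[b] else b
            a, b = rng.randrange(pop_size), rng.randrange(pop_size)
            p2 = a if fit[a] >= fit[b] else b
            # Uniform crossover.
            child = [pop[p1][k] if rng.random() < 0.5 else pop[p2][k]
                     for k in range(n_bits)]
            if rng.random() < p_inherit:
                # Inherit: average of the parents' fitness values.
                new_fit.append((fit[p1] + fit[p2]) / 2.0)
            else:
                # Actual (expensive) evaluation.
                new_fit.append(onemax(child))
                evaluations += 1
            new_pop.append(child)
        pop, fit = new_pop, new_fit

    best = max(onemax(ind) for ind in pop)
    return best, evaluations


best, evals = fitness_inheritance_ga()
print(best, evals)
```

With `p_inherit = 0.75`, only about a quarter of the offspring are evaluated directly, so the total number of actual evaluations is far below the `pop_size * (generations + 1)` a fully evaluated run would need. The paper's contribution differs from this sketch in that BOA estimates inherited fitness from the learned Bayesian network rather than from parent averages.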
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Pelikan, M., Sastry, K. (2004). Fitness Inheritance in the Bayesian Optimization Algorithm. In: Deb, K. (eds) Genetic and Evolutionary Computation – GECCO 2004. GECCO 2004. Lecture Notes in Computer Science, vol 3103. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24855-2_5
DOI: https://doi.org/10.1007/978-3-540-24855-2_5
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22343-6
Online ISBN: 978-3-540-24855-2