Fitness Inheritance in the Bayesian Optimization Algorithm

  • Conference paper
Genetic and Evolutionary Computation – GECCO 2004

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3103)

Abstract

This paper describes how fitness inheritance can be used to estimate the fitness of a proportion of newly sampled candidate solutions in the Bayesian optimization algorithm (BOA). The goal of estimating fitness for some candidate solutions is to reduce the number of fitness evaluations on problems where fitness evaluation is expensive. The Bayesian networks that BOA uses to model promising solutions and generate new ones are extended so that they not only model and sample candidate solutions but also estimate their fitness. The results indicate that fitness inheritance is a promising concept in BOA: the population sizes required to build accurate models of promising solutions also yield good fitness estimates, even when only a small proportion of candidate solutions is evaluated with the actual fitness function. This can reduce the number of actual fitness evaluations by a factor of 30 or more.
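To make the idea concrete, below is a minimal, hypothetical Python sketch of fitness inheritance in isolation. It is not the authors' implementation: in BOA the fitness estimate is read off the learned Bayesian network, whereas here a user-supplied surrogate (estimate_fitness) stands in for that model, and all names (evaluate_population, proportion_inherited, onemax) are illustrative. The sketch only shows the evaluation-saving mechanism: a fraction of candidates is scored by the cheap estimator rather than the expensive true fitness function.

    import random

    def onemax(candidate):
        """Toy 'expensive' fitness function: counts the ones in a bit string."""
        return sum(candidate)

    def evaluate_population(candidates, true_fitness, estimate_fitness,
                            proportion_inherited=0.9):
        """Assign a fitness to every candidate.

        A fraction `proportion_inherited` of candidates receives an estimated
        (inherited) fitness; the rest are evaluated with the true function.
        Returns the list of fitness values and the number of true evaluations.
        """
        fitnesses = []
        true_evaluations = 0
        for candidate in candidates:
            if random.random() < proportion_inherited:
                # Estimated fitness: no call to the expensive function.
                fitnesses.append(estimate_fitness(candidate))
            else:
                fitnesses.append(true_fitness(candidate))
                true_evaluations += 1
        return fitnesses, true_evaluations

    if __name__ == "__main__":
        random.seed(0)
        population = [[random.randint(0, 1) for _ in range(20)]
                      for _ in range(200)]
        # Crude surrogate: a noisy copy of the true fitness, standing in for
        # the Bayesian-network-based estimator described in the paper.
        surrogate = lambda c: onemax(c) + random.gauss(0, 1)
        fit, n_true = evaluate_population(population, onemax, surrogate)
        print(f"{n_true} true evaluations out of {len(population)} candidates")

With proportion_inherited set to 0.9, roughly 90% of candidates skip the expensive evaluation, i.e. about a tenfold saving; the savings reported in the abstract depend on how accurately the model-based estimates guide the search when the inherited proportion is pushed higher.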

Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pelikan, M., Sastry, K. (2004). Fitness Inheritance in the Bayesian Optimization Algorithm. In: Deb, K. (eds) Genetic and Evolutionary Computation – GECCO 2004. GECCO 2004. Lecture Notes in Computer Science, vol 3103. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24855-2_5

  • DOI: https://doi.org/10.1007/978-3-540-24855-2_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22343-6

  • Online ISBN: 978-3-540-24855-2
