Evolutionary Bayesian Classifier-Based Optimization in Continuous Domains

  • Teresa Miquélez
  • Endika Bengoetxea
  • Pedro Larrañaga
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4247)


In this work, we present a generalisation to continuous domains of an optimization method based on evolutionary computation that applies Bayesian classifiers in its learning process. The main difference between this new method, known as Evolutionary Bayesian Classifier-based Optimization Algorithms (EBCOAs), and other estimation of distribution algorithms (EDAs) is the way the fitness function is taken into account: it is treated as a new variable when generating the probabilistic graphical model that is then sampled to produce the next population.
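
The idea can be illustrated with a minimal sketch, which is not the authors' exact algorithm: fitness values label individuals into a "best" class (the fitness acting as an extra class variable), a Gaussian naive Bayes model of that class is learned, and the next population is sampled from the learned model. The function name and all parameters below are illustrative assumptions.

```python
import math
import random

def ebcoa_minimise(f, dim, pop_size=100, generations=50, elite_frac=0.3, seed=0):
    """Illustrative EBCOA-style sketch: the fitness induces a class label,
    a per-variable Gaussian model of the 'best' class is learned, and the
    next population is sampled from that model."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(pop_size)]
    best_x, best_fx = None, math.inf
    for _ in range(generations):
        scored = sorted(pop, key=f)          # rank individuals by fitness
        if f(scored[0]) < best_fx:
            best_x, best_fx = scored[0][:], f(scored[0])
        # label the top fraction as the 'best' class
        elite = scored[: max(2, int(elite_frac * pop_size))]
        # Gaussian naive Bayes model of the 'best' class: per-variable mean/stdev
        means = [sum(x[i] for x in elite) / len(elite) for i in range(dim)]
        stds = [max(1e-6, (sum((x[i] - means[i]) ** 2 for x in elite)
                           / len(elite)) ** 0.5) for i in range(dim)]
        # sample the next population from the learned model
        pop = [[rng.gauss(means[i], stds[i]) for i in range(dim)]
               for _ in range(pop_size)]
    return best_x, best_fx

# usage: minimise the sphere function in 5 dimensions
x, fx = ebcoa_minimise(lambda v: sum(t * t for t in v), dim=5)
```

A full EBCOA would build a richer Bayesian classifier over several fitness-induced classes rather than a single elite model, but the sketch captures the sampling loop described above.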

We also present experimental results comparing the performance of this new method with that of other evolutionary computation paradigms such as evolution strategies and EDAs. The results obtained show that this new approach achieves performance at least comparable to that of these other paradigms.


Keywords: Evolutionary Computation · Continuous Domain · Distribution Algorithm · Probabilistic Graphical Model · Evolution Strategy




Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Teresa Miquélez (1)
  • Endika Bengoetxea (1)
  • Pedro Larrañaga (1)

  1. Intelligent Systems Group, Computer Engineering Faculty, University of the Basque Country, San Sebastián, Spain
