Journal of Central South University, Volume 19, Issue 2, pp 443–452

Gaussian process assisted coevolutionary estimation of distribution algorithm for computationally expensive problems

  • Na Luo (罗娜)
  • Feng Qian (钱锋)
  • Liang Zhao (赵亮)
  • Wei-min Zhong (钟伟民)

Abstract

To reduce the computational cost of complex problems, a new surrogate-assisted estimation of distribution algorithm based on Gaussian processes was proposed. Coevolution was applied to dual populations that evolved in parallel: the search space was projected into multiple subspaces, each searched by a sub-population, while the whole space was explored by the other population, which exchanged information with the sub-populations. To make the evolutionary process efficient, a multivariate Gaussian model and a Gaussian mixture model were used in the two populations, respectively, to estimate the distribution of individuals and reproduce new generations. As the surrogate, a Gaussian process, which also predicts the variance of its predictions, was integrated into the algorithm. Results on six benchmark functions show that the new algorithm outperforms other surrogate-model-based algorithms, and its computational cost is only 10% of that of the original estimation of distribution algorithm.
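The surrogate-assisted loop described in the abstract can be sketched as follows. This is a minimal illustration only, assuming an RBF-kernel Gaussian process, the sphere function standing in for the expensive objective, and a single-population EDA with a multivariate Gaussian model rather than the paper's coevolutionary dual-population scheme; all names and parameters here are illustrative, not the authors' implementation.

```python
# Sketch: GP-assisted EDA. The GP prescreens sampled offspring so that only
# the few most promising candidates are evaluated on the expensive objective.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Stand-in for the computationally expensive objective."""
    return float(np.sum(x ** 2))

def rbf(A, B, length=1.0):
    """Squared-exponential kernel between two point sets."""
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / length ** 2)

def gp_predict(X, y, Xq, noise=1e-6):
    """GP posterior mean and variance at query points Xq."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xq, X)
    mean = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, np.maximum(var, 0.0)

def eda_gp(dim=2, pop=20, gens=30, evals_per_gen=4):
    """EDA with a multivariate Gaussian model; the GP surrogate filters
    offspring so only evals_per_gen true evaluations occur per generation."""
    X = rng.uniform(-5, 5, size=(pop, dim))
    y = np.array([sphere(x) for x in X])
    for _ in range(gens):
        elite = X[np.argsort(y)[: pop // 2]]
        mu = elite.mean(axis=0)
        cov = np.cov(elite.T) + 1e-6 * np.eye(dim)
        cand = rng.multivariate_normal(mu, cov, size=pop)
        mean, var = gp_predict(X, y, cand)
        # Optimistic prescreening: prefer low predicted value, high uncertainty.
        promising = cand[np.argsort(mean - var)[:evals_per_gen]]
        X = np.vstack([X, promising])
        y = np.concatenate([y, [sphere(x) for x in promising]])
    return float(y.min())

best = eda_gp()
```

The variance returned by `gp_predict` is what distinguishes a Gaussian process from most other surrogates: it lets the prescreening step trade off predicted quality against model uncertainty instead of trusting the mean prediction alone.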

Key words

estimation of distribution algorithm; fitness function modeling; Gaussian process; surrogate approach



Copyright information

© Central South University Press and Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Na Luo (罗娜) (1)
  • Feng Qian (钱锋) (1)
  • Liang Zhao (赵亮) (1)
  • Wei-min Zhong (钟伟民) (1)

  1. Key Laboratory of Advanced Control and Optimization for Chemical Processes of Ministry of Education, East China University of Science and Technology, Shanghai, China
