Journal of Global Optimization, Volume 68, Issue 3, pp 641–662

Pseudo expected improvement criterion for parallel EGO algorithm

Abstract

The efficient global optimization (EGO) algorithm is well known for its high efficiency in solving computationally expensive optimization problems. However, the expected improvement (EI) criterion used to select candidate points in the EGO process produces only one design point per optimization cycle, which wastes resources when parallel computing is available. In this work, a new criterion called pseudo expected improvement (PEI) is proposed for developing parallel EGO algorithms. In each cycle, the first updating point is selected by the initial EI function. After that, the PEI function is built to approximate the real updated EI function by multiplying the initial EI function by an influence function of the updating point. The influence function is designed to simulate the impact that the updating point will have on the EI function, and depends only on the position of the updating point, not on its function value. Therefore, the next updating point can be identified by maximizing the PEI function without evaluating the first updating point. As this sequential process continues, a desired number of updating points can be selected by the PEI criterion within one optimization cycle. The efficiency of the proposed PEI criterion is validated on six benchmark problems with dimensions ranging from 2 to 6. The results show that the proposed PEI algorithm performs significantly better than the standard EGO algorithm, and achieves significant improvements on five of the six test problems compared with a state-of-the-art parallel EGO algorithm. Furthermore, additional experiments show that the convergence of the proposed algorithm degrades significantly when the global maximum of the PEI function is not found. It is therefore recommended to use as many evaluations as one can afford to find the global maximum of the PEI function.
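To make the mechanism concrete, the sketch below implements one PEI selection cycle in Python. It is a minimal illustration, not the paper's reference implementation: the influence function is assumed to take a Gaussian-correlation form (zero at the chosen point, approaching one far away), the kriging model is abstracted as a `predict` callback returning the predicted mean and standard deviation, and the inner maximization of PEI is done by simple random search; the function names (`select_batch`, `pseudo_ei`) are illustrative, not from the paper.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, y_best):
    """Standard EI for minimization, from kriging mean mu and std sigma."""
    sigma = np.maximum(sigma, 1e-12)          # guard against zero variance
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def influence(x, x_update, theta=10.0):
    """Influence function of an already-selected update point.
    Assumed Gaussian-correlation form for illustration: it is 0 at
    x_update and tends to 1 far away, damping PEI near chosen points."""
    r2 = np.sum((np.atleast_2d(x) - x_update) ** 2, axis=1)
    return 1.0 - np.exp(-theta * r2)

def pseudo_ei(x, predict, y_best, selected):
    """PEI(x) = EI(x) times the influence functions of the points already
    chosen this cycle; neither the expensive function nor the kriging
    model is updated between selections."""
    mu, sigma = predict(x)                    # kriging prediction at x
    pei = expected_improvement(mu, sigma, y_best)
    for x_u in selected:
        pei = pei * influence(x, x_u)
    return pei

def select_batch(predict, y_best, lower, upper, q=4, n_cand=20000, seed=0):
    """Pick q update points in one cycle. With no points selected yet,
    PEI equals EI, so the first point is the EI maximizer, as in the
    abstract. Random search stands in for the global maximization of
    PEI, on which the paper recommends spending as many evaluations
    as one can afford."""
    rng = np.random.default_rng(seed)
    selected = []
    for _ in range(q):
        cand = rng.uniform(lower, upper, size=(n_cand, len(lower)))
        scores = pseudo_ei(cand, predict, y_best, selected)
        selected.append(cand[np.argmax(scores)])
    return np.array(selected)
```

Once the q points are returned, they can be evaluated in parallel and the kriging model re-fitted before the next cycle begins.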

Keywords

Efficient global optimization · Expected improvement · Parallel computing · Pseudo expected improvement · Influence function

Acknowledgements

We would like to thank the anonymous reviewers for their helpful comments.

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

School of Naval Architecture and Ocean Engineering, Huazhong University of Science and Technology, Wuhan, China
