
Doubly Trained Evolution Control for the Surrogate CMA-ES

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9921)

Abstract

This paper presents a new variant of surrogate-model utilization in expensive continuous evolutionary black-box optimization. The algorithm builds on the surrogate version of the CMA-ES, the Surrogate Covariance Matrix Adaptation Evolution Strategy (S-CMA-ES). As in the original S-CMA-ES, expensive function evaluations are saved through a surrogate model. However, in each generation the model is retrained after the points at which its prediction was most uncertain have been evaluated by the true fitness. We demonstrate that, within a small budget of evaluations, the new variant improves on the original S-CMA-ES and outperforms two state-of-the-art surrogate optimizers, except for the first few evaluations of the optimization process.
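To make the doubly trained evolution control concrete, the following is a minimal sketch of one generation, assuming a Gaussian process surrogate from scikit-learn and hypothetical helper names (true_fitness, archive_X, archive_y, population); it illustrates the train–evaluate–retrain loop described above and is not the authors' implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor


def doubly_trained_generation(population, archive_X, archive_y,
                              true_fitness, n_orig=2):
    """Rank CMA-ES offspring using a GP retrained on its most uncertain points.

    population           -- (lambda, dim) array of offspring sampled by CMA-ES
    archive_X, archive_y -- previously evaluated points and their true fitness
    true_fitness         -- the expensive objective function
    n_orig               -- points evaluated by the true fitness per generation
    """
    # First training: fit the surrogate on the archive and predict the offspring
    # together with the predictive standard deviation.
    model = GaussianProcessRegressor(normalize_y=True).fit(archive_X, archive_y)
    _, std = model.predict(population, return_std=True)

    # Select the points where the prediction is most uncertain and evaluate
    # them with the true (expensive) fitness.
    uncertain = np.argsort(-std)[:n_orig]
    y_true = np.array([true_fitness(x) for x in population[uncertain]])

    # Second training: retrain on the enlarged archive and re-predict the rest.
    archive_X = np.vstack([archive_X, population[uncertain]])
    archive_y = np.concatenate([archive_y, y_true])
    model = GaussianProcessRegressor(normalize_y=True).fit(archive_X, archive_y)
    y_pred = model.predict(population)

    # Keep the true values where available; CMA-ES only needs the ranking.
    y_pred[uncertain] = y_true
    return y_pred, archive_X, archive_y
```

The selection criterion shown here (largest predictive standard deviation) is one possible uncertainty measure; the key point is that the surrogate is trained twice per generation, once before and once after the true evaluations.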

Keywords

Black-box optimization · Surrogate model · Evolution control · Gaussian process

Notes

Acknowledgements

This work was supported by the Grant Agency of the Czech Technical University in Prague with its grant No. SGS14/205/OHK4/3T/14, by the Czech Health Research Council project NV15-33250A, by the project “National Institute of Mental Health (NIMH-CZ)”, grant number ED2.1.00/03.0078, co-funded by the European Regional Development Fund, and by the project No. LO1611 with financial support from the MEYS under the NPU I program. Further, access to computing and storage facilities owned by parties and projects contributing to the National Grid Infrastructure MetaCentrum, provided under the programme “Projects of Large Infrastructure for Research, Development, and Innovations” (LM2010005), is greatly appreciated.


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. National Institute of Mental Health, Klecany, Czech Republic
  2. Faculty of Nuclear Sciences and Physical Engineering, Czech Technical University in Prague, Prague 1, Czech Republic
  3. Institute of Computer Science, Academy of Sciences of the Czech Republic, Prague 8, Czech Republic
  4. Faculty of Mathematics and Physics, Charles University in Prague, Prague 1, Czech Republic
  5. Unicorn College, Prague 3, Czech Republic
