Parameter Setting for Multicore CMA-ES with Large Populations

  • Nacim Belkhir
  • Johann Dréo
  • Pierre Savéant
  • Marc Schoenauer
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9554)

Abstract

The goal of this paper is to investigate the overall performance of CMA-ES when a large number of cores is available, considering a direct mapping between cores and individuals, and to empirically determine the best parameter strategies for a parallel machine. Framing this as a parameter setting problem, we empirically derive a new strategy for CMA-ES, and we investigate whether Self-CMA-ES (a self-adaptive variant of CMA-ES) is a viable alternative to CMA-ES on parallel computers with a coarse-grained distribution of the fitness evaluations. For large population sizes, the resulting new strategies for CMA-ES and Self-CMA-ES are experimentally validated on the BBOB benchmark, where they are shown to outperform CMA-ES with its default parameter strategy.

Keywords

Empirical study · Numerical optimization · Metaheuristics · Algorithms comparison


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Nacim Belkhir (1, 2)
  • Johann Dréo (1)
  • Pierre Savéant (1)
  • Marc Schoenauer (2)
  1. Thales Research and Technology, Palaiseau, France
  2. TAO, Inria Saclay Île-de-France, Orsay, France