A Comparative Study of Large-Scale Variants of CMA-ES

  • Konstantinos Varelas
  • Anne Auger
  • Dimo Brockhoff
  • Nikolaus Hansen
  • Ouassim Ait ElHara
  • Yann Semet
  • Rami Kassab
  • Frédéric Barbaresco
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11101)

Abstract

The CMA-ES is one of the most powerful stochastic numerical optimizers for difficult black-box problems. Its intrinsic time and space complexity is quadratic, which increasingly limits its applicability as the problem dimension grows. To circumvent this limitation, different large-scale variants of CMA-ES with subquadratic complexity have been proposed over the past ten years. To date, however, these variants have been tested and compared only in rather restrictive settings, due to the lack of a comprehensive large-scale testbed to assess their performance. In this context, we introduce a new large-scale testbed with dimensions up to 640, implemented within the COCO benchmarking platform. We use this testbed to assess the performance of several promising variants of CMA-ES and of the standard limited-memory quasi-Newton method L-BFGS. In all tested dimensions, the best CMA-ES variant solves more problems than L-BFGS for larger budgets, while L-BFGS outperforms the best CMA-ES variant for smaller budgets. Over all functions, however, the cumulative runtime distributions of L-BFGS and the best CMA-ES variants are close (within a factor of 4 in high dimension).
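The quadratic cost stems from the full covariance matrix that CMA-ES adapts: a symmetric matrix \(C \in \mathbb{R}^{n \times n}\) carries \(n(n+1)/2\) free parameters, so storing and updating it takes \(\Theta(n^2)\) time and memory per iteration; for \(n = 640\), the largest dimension considered here, the covariance matrix alone already amounts to \(640 \cdot 641 / 2 = 205\,120\) parameters. The large-scale variants reach subquadratic complexity by restricting \(C\) to structured models with fewer degrees of freedom, for instance a diagonal model (sep-CMA-ES), a diagonal-plus-low-rank model (VkD-CMA), or a limited-memory reconstruction of the sampling transformation (LM-CMA).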

Our results illustrate different scaling behaviors of the methods, expose a few defects of the algorithms, and reveal that for dimensions larger than 80 LM-CMA solves more problems than VkD-CMA, while in the cumulative runtime distribution over all functions VkD-CMA dominates LM-CMA for budgets up to \(10^4\) times dimension, and for all budgets in dimensions up to 80.
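Such a comparison can be reproduced with a short experiment script. The following is a minimal sketch, assuming the testbed is exposed as the "bbob-largescale" suite of the COCO platform and taking SciPy's L-BFGS-B (with finite-difference gradients) as the baseline; the budget and the result-folder name are illustrative choices, not the exact protocol of the paper:

    import cocoex           # COCO experimentation module (pip install coco-experiment)
    import scipy.optimize   # provides the L-BFGS-B baseline

    # The large-scale suite contains the 24 bbob functions in
    # dimensions 20, 40, 80, 160, 320 and 640.
    suite = cocoex.Suite("bbob-largescale", "", "")
    observer = cocoex.Observer("bbob-largescale", "result_folder: LBFGS")

    for problem in suite:
        problem.observe_with(observer)    # record all evaluations for post-processing
        budget = 100 * problem.dimension  # budget proportional to dimension (illustrative)
        scipy.optimize.minimize(problem, problem.initial_solution,
                                method="L-BFGS-B",  # gradients by finite differences
                                options={"maxfun": budget})
        problem.free()

The data written by the observer can then be aggregated into cumulative runtime distributions like the ones discussed above with COCO's post-processing module, cocopp.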

Acknowledgement

The PhD thesis of Konstantinos Varelas is funded by the French MoD DGA/MRIS and Thales Land & Air Systems.

References

  1. Ait ElHara, O., Auger, A., Hansen, N.: Permuted orthogonal block-diagonal transformation matrices for large scale optimization benchmarking. In: Genetic and Evolutionary Computation Conference (GECCO 2016), pp. 189–196. ACM (2016)
  2. Akimoto, Y., Hansen, N.: Online model selection for restricted covariance matrix adaptation. In: Handl, J., Hart, E., Lewis, P.R., López-Ibáñez, M., Ochoa, G., Paechter, B. (eds.) PPSN 2016. LNCS, vol. 9921, pp. 3–13. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-45823-6_1
  3. Akimoto, Y., Hansen, N.: Projection-based restricted covariance matrix adaptation for high dimension. In: Genetic and Evolutionary Computation Conference (GECCO 2016), pp. 197–204. ACM, Denver (2016)
  4. Hansen, N., Auger, A., Mersmann, O., Tušar, T., Brockhoff, D.: COCO: a platform for comparing continuous optimizers in a black-box setting (2016). arXiv:1603.08785
  5. Hansen, N., Finck, S., Ros, R., Auger, A.: Real-parameter black-box optimization benchmarking 2009: noiseless functions definitions. Research Report RR-6829, INRIA (2009)
  6. Hansen, N., Ostermeier, A.: Completely derandomized self-adaptation in evolution strategies. Evol. Comput. 9(2), 159–195 (2001)
  7. Knight, J.N., Lunacek, M.: Reducing the space-time complexity of the CMA-ES. In: Genetic and Evolutionary Computation Conference (GECCO 2007), pp. 658–665. ACM (2007)
  8. Krause, O., Arbonès, D.R., Igel, C.: CMA-ES with optimal covariance update and storage complexity. In: Advances in Neural Information Processing Systems (NIPS 2016)
  9. Li, Z., Zhang, Q.: A simple yet efficient evolution strategy for large scale black-box optimization. IEEE Trans. Evol. Comput. (2017, accepted)
  10. Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Program. 45(3), 503–528 (1989)
  11. Loshchilov, I.: LM-CMA: an alternative to L-BFGS for large scale black-box optimization. Evol. Comput. 25, 143–171 (2017)
  12. Loshchilov, I.: A computationally efficient limited memory CMA-ES for large scale optimization. In: Genetic and Evolutionary Computation Conference (GECCO 2014), pp. 397–404. ACM (2014)
  13. Loshchilov, I., Glasmachers, T., Beyer, H.: Limited-memory matrix adaptation for large scale black-box optimization. CoRR abs/1705.06693 (2017)
  14. Ros, R., Hansen, N.: A simple modification in CMA-ES achieving linear time and space complexity. In: Rudolph, G., Jansen, T., Beume, N., Lucas, S., Poloni, C. (eds.) PPSN 2008. LNCS, vol. 5199, pp. 296–305. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-87700-4_30
  15. Sun, Y., Gomez, F.J., Schaul, T., Schmidhuber, J.: A linear time natural evolution strategy for non-separable functions. CoRR abs/1106.1998 (2011)
  16. Suttorp, T., Hansen, N., Igel, C.: Efficient covariance matrix update for variable metric evolution strategies. Mach. Learn. 75(2), 167–197 (2009)
  17. Tang, K., et al.: Benchmark functions for the CEC 2008 special session and competition on large scale global optimization (2007)

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Konstantinos Varelas (1, 2), email author
  • Anne Auger (1)
  • Dimo Brockhoff (1)
  • Nikolaus Hansen (1)
  • Ouassim Ait ElHara (1)
  • Yann Semet (3)
  • Rami Kassab (2)
  • Frédéric Barbaresco (2)

  1. Inria, RandOpt team, CMAP, École Polytechnique, Palaiseau, France
  2. Thales LAS France SAS - Limours, Limours, France
  3. Thales Research & Technology, Palaiseau, France
