Maximum Likelihood-Based Online Adaptation of Hyper-Parameters in CMA-ES

  • Ilya Loshchilov
  • Marc Schoenauer
  • Michèle Sebag
  • Nikolaus Hansen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8672)

Abstract

The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is widely accepted as a robust derivative-free continuous optimization algorithm for non-linear and non-convex optimization problems. CMA-ES is well known to be almost parameterless: the population size is the only hyper-parameter the user is typically expected to tune. In this paper, we propose a principled approach, called self-CMA-ES, to adapt the hyper-parameters of CMA-ES online in order to improve its overall performance. Experimental results show that for larger-than-default population sizes, the default hyper-parameter settings of CMA-ES are far from optimal, and that self-CMA-ES allows these settings to be approached dynamically.
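
To make the mechanism concrete, the sketch below (Python/NumPy) illustrates the maximum-likelihood criterion that drives the hyper-parameter adaptation. It is an illustrative simplification, not the authors' implementation: the paper adapts several learning rates of the full CMA-ES via an auxiliary CMA-ES, whereas this sketch adapts a single rank-mu learning rate `c_cov` over a hypothetical three-point grid, and omits step-size adaptation and evolution paths. The function names and the sphere test problem are assumptions made for the example.

```python
# Minimal sketch of the maximum-likelihood hyper-parameter criterion
# behind self-CMA-ES (illustrative only; NOT the authors' reference code).
# Only a rank-mu covariance update with one learning rate `c_cov` is
# adapted here, over a hypothetical three-point grid instead of the
# auxiliary CMA-ES used in the paper.
import numpy as np


def gauss_loglik(points, mean, cov):
    """Mean Gaussian log-density of `points` under N(mean, cov)."""
    d = points.shape[1]
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    diff = points - mean
    quad = np.einsum('ij,jk,ik->i', diff, inv, diff)
    return float(np.mean(-0.5 * (d * np.log(2 * np.pi) + logdet + quad)))


def sphere(x):
    """Toy objective; the paper benchmarks on harder functions."""
    return float(np.dot(x, x))


rng = np.random.default_rng(0)
d, lam, mu = 5, 20, 10
mean, sigma, C = rng.normal(size=d), 1.0, np.eye(d)
c_cov = 0.1                                  # hyper-parameter to adapt online
grid = [0.5 * c_cov, c_cov, 2.0 * c_cov]     # hypothetical candidate grid
prev = None                                  # (C, rank-mu matrix) of step t-1

for t in range(100):
    # Sample lambda offspring from N(mean, sigma^2 C), keep the best mu.
    pop = mean + sigma * rng.multivariate_normal(np.zeros(d), C, size=lam)
    best = pop[np.argsort([sphere(x) for x in pop])][:mu]

    if prev is not None:
        # Score each candidate by the likelihood that the covariance it
        # WOULD have produced last step assigns to the points selected now:
        # the maximum-likelihood criterion for hyper-parameter adaptation.
        C_old, Z_old = prev
        scores = [gauss_loglik(best, mean,
                               sigma ** 2 * ((1 - c) * C_old + c * Z_old))
                  for c in grid]
        c_cov = grid[int(np.argmax(scores))]
        grid = [max(1e-4, 0.5 * c_cov), c_cov, min(1.0, 2.0 * c_cov)]

    # Rank-mu covariance update with the currently selected learning rate.
    Y = (best - mean) / sigma
    Z = (Y.T @ Y) / mu
    prev = (C.copy(), Z)
    C = (1 - c_cov) * C + c_cov * Z
    mean = best.mean(axis=0)

print('f(mean) after 100 iterations:', sphere(mean))
```

The design point the sketch preserves is the paper's auxiliary optimization problem: candidate hyper-parameter values are judged not by the primary fitness, but by the likelihood that the sampling distribution they would have produced assigns to the best solutions of the current iteration.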

Keywords

Evolutionary Computation · Covariance Matrix Adaptation Evolution Strategy · Rosenbrock Function · Evolve Fuzzy System · Auxiliary Optimization Problem
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Ilya Loshchilov (1)
  • Marc Schoenauer (2, 3)
  • Michèle Sebag (3, 2)
  • Nikolaus Hansen (2, 3)

  1. Laboratory of Intelligent Systems, École Polytechnique Fédérale de Lausanne, Switzerland
  2. TAO Project-team, INRIA Saclay, Île-de-France, France
  3. Laboratoire de Recherche en Informatique (UMR CNRS 8623), Université Paris-Sud, Orsay Cedex, France
