
Parameter Setting for Multicore CMA-ES with Large Populations

  • Conference paper in Artificial Evolution (EA 2015)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 9554)

Abstract

This paper investigates the overall performance of CMA-ES when a large number of cores is available, considering a direct mapping between cores and individuals, and empirically determines the best parameter strategies for a parallel machine. Framing this as a parameter-setting problem, we empirically derive a new strategy for CMA-ES, and we investigate whether Self-CMA-ES (a self-adaptive variant of CMA-ES) is a viable alternative to CMA-ES on parallel computers with a coarse-grained distribution of the fitness evaluations. For large population sizes, the resulting new strategy for CMA-ES and Self-CMA-ES is experimentally validated on the BBOB benchmark, where it is shown to outperform CMA-ES with its default parameter strategy.
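The "default parameter strategy" against which the new strategy is compared follows Hansen's standard population settings for (μ/μ_w, λ)-CMA-ES. As a rough sketch (these are the well-known defaults, not the authors' tuned strategy; the core-rounding option is an illustration of footnote 1 below), the defaults in dimension n are:

```python
import math

def cmaes_default_parameters(n, n_cores=None):
    """Standard (mu/mu_w, lambda)-CMA-ES population defaults.
    If n_cores is given, lambda is rounded up to a multiple of the
    core count, so every core evaluates the same number of
    individuals per generation."""
    lam = 4 + int(3 * math.log(n))            # default offspring number
    if n_cores is not None:
        lam = ((lam + n_cores - 1) // n_cores) * n_cores
    mu = lam // 2                             # number of selected parents
    # Log-linear recombination weights, normalised to sum to 1.
    raw = [math.log(mu + 0.5) - math.log(i) for i in range(1, mu + 1)]
    total = sum(raw)
    weights = [w / total for w in raw]
    return lam, mu, weights
```

For example, in dimension n = 10 the default is λ = 10 and μ = 5; on a machine with many cores, λ is instead driven by the core count, which is precisely the large-population regime this paper studies.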


Notes

  1. This also covers the case where \(\lambda\) is set to some multiple of the number of cores.

  2. SMAC is freely available at http://www.cs.ubc.ca/labs/beta/Projects/SMAC/.

  3. http://coco.gforge.inria.fr.

  4. https://sites.google.com/site/selfcmappsn/.
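The coarse-grained distribution of fitness evaluations discussed in the abstract, with one individual mapped to one core (or λ a multiple of the core count, per footnote 1), can be sketched as follows. This is an illustrative sketch, not the authors' code; `sphere` stands in for a BBOB test function:

```python
import os
from concurrent.futures import ProcessPoolExecutor

def sphere(x):
    """Separable sphere function, the simplest BBOB-style objective."""
    return sum(xi * xi for xi in x)

def evaluate_population(population, fitness=sphere, workers=None):
    """Distribute fitness evaluations over the available cores.
    The model is synchronous: selection and the covariance-matrix
    update can only start once the slowest evaluation has finished."""
    with ProcessPoolExecutor(max_workers=workers or os.cpu_count()) as pool:
        return list(pool.map(fitness, population))
```

With cheap fitness functions the per-evaluation dispatch overhead dominates, which is one reason the paper treats the population size, rather than wall-clock time per evaluation, as the quantity tied to the core count.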


Author information

Correspondence to Nacim Belkhir.


Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Belkhir, N., Dréo, J., Savéant, P., Schoenauer, M. (2016). Parameter Setting for Multicore CMA-ES with Large Populations. In: Bonnevay, S., Legrand, P., Monmarché, N., Lutton, E., Schoenauer, M. (eds) Artificial Evolution. EA 2015. Lecture Notes in Computer Science, vol. 9554. Springer, Cham. https://doi.org/10.1007/978-3-319-31471-6_9

  • DOI: https://doi.org/10.1007/978-3-319-31471-6_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-31470-9

  • Online ISBN: 978-3-319-31471-6
