
Feature Based Algorithm Configuration: A Case Study with Differential Evolution

  • Nacim Belkhir
  • Johann Dréo
  • Pierre Savéant
  • Marc Schoenauer
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9921)

Abstract

Algorithm configuration remains an intricate problem, especially in the continuous black-box optimization domain. This paper empirically investigates the relationship between continuous problem features (measuring different problem characteristics) and the best parameter configuration of a given stochastic algorithm over a benchmark of test functions, namely the original version of Differential Evolution over the BBOB test bench. This is achieved by learning an empirical performance model from the problem features and the algorithm parameters. This performance model can then be used to compute an empirically optimal parameter configuration from feature values. The results show that reasonable performance models can indeed be learned, resulting in a better parameter configuration than a static parameter setting optimized for robustness over the whole test bench.
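
As an illustration of the general approach sketched in the abstract (and not the authors' actual method or benchmark setup), the minimal sketch below assumes scikit-learn and purely synthetic data: a random-forest regressor is trained on pairs of problem features and Differential Evolution parameters to predict performance, and the configuration with the best predicted performance is returned for a new problem's feature values. The feature count, the DE parameter grid, and the performance values are hypothetical placeholders.

    # Minimal illustrative sketch (not the paper's code): learn an empirical
    # performance model mapping (problem features, DE parameters) -> performance,
    # then pick the configuration with the best predicted performance for a new
    # problem described by its features. All data here are synthetic placeholders.
    import itertools
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # Hypothetical training set: each row pairs landscape features with a DE
    # configuration; the target is the observed performance (lower is better).
    n_samples, n_features = 500, 5
    X_feat = rng.random((n_samples, n_features))   # e.g. 5 problem features
    X_params = rng.random((n_samples, 3))          # e.g. scaled pop. size, F, CR
    y_perf = rng.random(n_samples)                 # e.g. best fitness at fixed budget

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(np.hstack([X_feat, X_params]), y_perf)

    # Candidate DE configurations: an illustrative grid over the scaled parameters.
    grid = np.array(list(itertools.product(np.linspace(0.0, 1.0, 5), repeat=3)))

    def best_config(features):
        """Return the candidate configuration with the lowest predicted performance."""
        X_query = np.hstack([np.tile(features, (len(grid), 1)), grid])
        return grid[np.argmin(model.predict(X_query))]

    # Usage: compute features of a new problem instance, then query the model.
    print(best_config(rng.random(n_features)))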

Keywords

Empirical study · Black-box continuous optimization · Problem features · Algorithm configuration · Empirical performance model · Differential Evolution

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Nacim Belkhir (1, 2)
  • Johann Dréo (1)
  • Pierre Savéant (1)
  • Marc Schoenauer (2)
  1. Thales Research & Technology, Palaiseau, France
  2. TAO, Inria Saclay Île-de-France, Orsay, France