
Modern Continuous Optimization Algorithms for Tuning Real and Integer Algorithm Parameters

  • Zhi Yuan
  • Marco A. Montes de Oca
  • Mauro Birattari
  • Thomas Stützle
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6234)

Abstract

To obtain peak performance from optimization algorithms, their parameters must be set appropriately. Frequently, algorithm parameters take values from the set of real numbers or from a large integer set. To tune such parameters, it is appealing to apply state-of-the-art continuous optimization algorithms instead of a tedious and error-prone hands-on approach. In this paper, we study the performance of several continuous optimization algorithms on the algorithm parameter tuning task. As case studies, we use a number of optimization algorithms from the swarm intelligence literature.
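The setup described here can be pictured as a black-box loop: the tuner proposes a parameter vector, the target algorithm is run with those parameters on a set of training instances, and the mean cost of those runs is returned as the objective value. The sketch below illustrates only this framing, not the authors' experimental protocol: the target algorithm is replaced by a synthetic noisy cost function, the names run_target_algorithm, alpha, and ants are hypothetical, and Nelder-Mead merely stands in for whichever derivative-free continuous optimizer plays the tuner's role (the paper considers algorithms such as CMA-ES). Rounding the continuous proposal is one simple way to handle an integer parameter with a continuous tuner.

```python
# Minimal sketch of parameter tuning cast as continuous optimization.
# The "target algorithm" is a synthetic stand-in; in a real setting it
# would invoke the algorithm to be tuned and return its solution cost.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
INSTANCES = range(10)  # hypothetical training instances

def run_target_algorithm(alpha, ants, instance):
    """Placeholder: noisy cost of one run with parameters (alpha, ants)."""
    noise = rng.normal(scale=0.05)
    return (alpha - 1.5) ** 2 + 0.01 * (ants - 25) ** 2 + 0.1 * instance + noise

def tuning_objective(x):
    """Map a continuous parameter vector to the mean cost over training instances."""
    alpha = float(np.clip(x[0], 0.0, 5.0))    # real-valued parameter
    ants = int(round(np.clip(x[1], 5, 100)))  # integer parameter, handled by rounding
    costs = [run_target_algorithm(alpha, ants, i) for i in INSTANCES]
    # The objective is stochastic; real tuners must cope with this noise,
    # e.g. by averaging over more runs per candidate.
    return float(np.mean(costs))

# Any derivative-free continuous optimizer can act as the tuner here;
# Nelder-Mead is simply a readily available example.
result = minimize(tuning_objective, x0=[1.0, 50.0], method="Nelder-Mead",
                  options={"maxfev": 200, "xatol": 1e-2, "fatol": 1e-3})
print("tuned parameters (alpha, ants):", result.x, "mean cost:", result.fun)
```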

Keywords

Particle Swarm Optimization · Traveling Salesman Problem · Sampling Algorithm · Continuous Optimization · Covariance Matrix Adaptation Evolution Strategy



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Zhi Yuan (1)
  • Marco A. Montes de Oca (1)
  • Mauro Birattari (1)
  • Thomas Stützle (1)
  1. IRIDIA, CoDE, Université Libre de Bruxelles, Brussels, Belgium
