Abstract
To obtain peak performance from optimization algorithms, their parameters must be set appropriately. Frequently, algorithm parameters take values from the set of real numbers or from a large integer set. For tuning such parameters, it is appealing to apply state-of-the-art continuous optimization algorithms instead of a tedious, error-prone hands-on approach. In this paper, we study the performance of several continuous optimization algorithms on the algorithm parameter tuning task. As case studies, we use a number of optimization algorithms from the swarm intelligence literature.
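The tuning task described above can be framed as a black-box continuous minimization problem: each candidate parameter setting is evaluated by running the target algorithm on training instances and averaging the resulting solution costs. The following Python sketch is illustrative only, not the paper's experimental setup: `run_target_algorithm`, the parameters `alpha` (real-valued) and `ants` (integer-valued), and the toy cost surface are all invented for the example, and SciPy's Nelder-Mead merely stands in for whichever state-of-the-art continuous optimizer one chooses. Integer parameters are handled here by rounding the continuous value inside the objective, one common way to let a continuous optimizer tune them.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in for the algorithm being tuned (e.g., a swarm
# intelligence algorithm): runs it on one instance with the given
# parameter setting and returns the solution cost. Here a toy quadratic
# surrogate with a per-instance offset is used so the sketch is runnable.
def run_target_algorithm(alpha: float, ants: int, instance: int) -> float:
    rng = np.random.default_rng(instance)
    return (alpha - 1.5) ** 2 + 0.01 * (ants - 25) ** 2 + rng.normal(0.0, 0.1)

def tuning_objective(x: np.ndarray) -> float:
    alpha = x[0]              # real-valued parameter, used as-is
    ants = int(round(x[1]))   # integer parameter: round the continuous value
    # Average cost over a set of training instances estimates the
    # expected performance of this parameter setting.
    return float(np.mean([run_target_algorithm(alpha, ants, i) for i in range(10)]))

# Any continuous black-box optimizer can be plugged in here; Nelder-Mead
# is used purely because it is readily available in SciPy.
result = minimize(tuning_objective, x0=np.array([1.0, 10.0]), method="Nelder-Mead")
print("tuned alpha:", result.x[0], "tuned ants:", int(round(result.x[1])))
```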
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Yuan, Z., de Oca, M.A.M., Birattari, M., Stützle, T. (2010). Modern Continuous Optimization Algorithms for Tuning Real and Integer Algorithm Parameters. In: Dorigo, M., et al. Swarm Intelligence. ANTS 2010. Lecture Notes in Computer Science, vol 6234. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-15461-4_18
DOI: https://doi.org/10.1007/978-3-642-15461-4_18
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-15460-7
Online ISBN: 978-3-642-15461-4