
Automated Design of Metaheuristic Algorithms

  • Thomas Stützle
  • Manuel López-Ibáñez
Chapter
Part of the International Series in Operations Research & Management Science book series (ISOR, volume 272)

Abstract

The design and development of metaheuristic algorithms can be time-consuming and difficult for a number of reasons, including the complexity of the problems being tackled, the large number of degrees of freedom when designing an algorithm and setting its numerical parameters, and the difficulty of analyzing such algorithms due to heuristic biases and stochasticity. Traditionally, this design and development has been carried out manually, in a labor-intensive process guided mainly by the expertise and intuition of the algorithm designer. In recent years, a number of automatic algorithm configuration methods have been developed that can effectively search large and diverse parameter spaces, and they have proved very successful in identifying high-performing algorithm designs and parameter settings. In this chapter, we review recent advances in the automatic design and configuration of metaheuristic algorithms. We describe the main existing automatic algorithm configuration techniques and discuss their principal uses, which range from optimizing the performance of already developed metaheuristic algorithms to playing a pivotal role in changing how metaheuristic algorithms will be designed and developed in the future.
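
To make the configuration setting concrete, the sketch below illustrates, in plain Python, the kind of interface an automatic configurator works with: a design space mixing categorical and numerical parameters, a target-algorithm run evaluated on a set of training instances and seeds, and a search loop over candidate configurations. It is a minimal illustration only, assuming hypothetical parameter names and a synthetic cost function; real configurators use far more sophisticated sampling, racing, and model-based strategies than the naive random search shown here.

```python
import random
import statistics

# Hypothetical design space of a local-search metaheuristic, mixing a categorical
# component choice with integer- and real-valued numerical parameters.
PARAM_SPACE = {
    "neighbourhood": ["swap", "insert", "two_opt"],   # categorical component choice
    "initial_temp":  (1.0, 100.0),                    # real-valued parameter range
    "tabu_tenure":   (5, 50),                         # integer parameter range
}

def sample_configuration(rng):
    """Draw one candidate configuration uniformly at random from the design space."""
    return {
        "neighbourhood": rng.choice(PARAM_SPACE["neighbourhood"]),
        "initial_temp":  rng.uniform(*PARAM_SPACE["initial_temp"]),
        "tabu_tenure":   rng.randint(*PARAM_SPACE["tabu_tenure"]),
    }

def run_target_algorithm(config, instance, seed):
    """Placeholder for one run of the tuned metaheuristic on one training instance.
    A real setup would launch the solver (often as an external program) and parse
    the solution cost it reports; a synthetic cost surface stands in here."""
    rng = random.Random(f"{instance}-{seed}")
    cost = 1.0 if config["neighbourhood"] != "two_opt" else 0.0
    cost += abs(config["initial_temp"] - 20.0) / 100.0
    cost += config["tabu_tenure"] / 100.0
    return cost + rng.gauss(0.0, 0.05)

def configure(instances, n_candidates=50, seed=1):
    """Naive configurator: evaluate random candidates on all training instances
    and return the configuration with the best mean cost."""
    rng = random.Random(seed)
    best_config, best_score = None, float("inf")
    for _ in range(n_candidates):
        config = sample_configuration(rng)
        score = statistics.mean(
            run_target_algorithm(config, inst, run_seed)
            for inst in instances
            for run_seed in range(3)          # a few runs per instance
        )
        if score < best_score:
            best_config, best_score = config, score
    return best_config, best_score

if __name__ == "__main__":
    training_instances = [f"instance_{i}" for i in range(10)]  # hypothetical instance names
    config, score = configure(training_instances)
    print("best configuration:", config, "mean cost:", round(score, 3))
```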

Acknowledgements

The authors would like to thank the editors for their careful reading of the chapter and their valuable comments, which improved the presentation. Thomas Stützle acknowledges support from the F.R.S.-FNRS, of which he is a research director. This work received support from the COMEX project P7/36 within the Interuniversity Attraction Poles Programme of the Belgian Science Policy Office.

Copyright information

© Springer International Publishing AG, part of Springer Nature 2019

Authors and Affiliations

  1. Université Libre de Bruxelles (ULB), Brussels, Belgium
  2. Alliance Manchester Business School, University of Manchester, Manchester, UK
