Abstract
This chapter focuses on automatic and interactive tuning methods for stochastic optimization algorithms, e.g., evolutionary algorithms. Algorithm tuning is important because it helps to avoid incorrect parameter settings, to improve existing algorithms, to select the best algorithm for a given real-world problem, to demonstrate the value of a novel algorithm, to evaluate the performance of an optimization algorithm under different option settings, and to obtain an algorithm instance that is robust to changes in the problem specification. The chapter discusses strategic issues and defines eight key topics for tuning: optimization algorithms, test problems, experimental setup, performance metrics, reporting, parallelization, tuning methods, and software. Features of established tuning software packages such as IRACE, SPOT, SMAC, and ParamILS are compared.
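To make the notion of algorithm tuning concrete, the following is a minimal sketch of what an automatic tuner does: sample candidate parameter configurations, evaluate each by repeated runs of the stochastic algorithm on a test problem, and return the best-performing configuration. The test problem (sphere function), the tuned algorithm (a (1+1)-ES with fixed step size), and all names below are illustrative assumptions; the sketch does not reproduce the packages compared in the chapter (IRACE, SPOT, SMAC, ParamILS).

```python
# Minimal algorithm-tuning sketch (assumed setup, not the chapter's own code).
import random
import statistics

def sphere(x):
    """Test problem: f(x) = sum of x_i^2, minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def one_plus_one_es(sigma, dim=10, budget=2000, rng=None):
    """A (1+1)-ES with fixed mutation step size sigma; returns the best f-value found."""
    rng = rng or random.Random()
    parent = [rng.uniform(-5, 5) for _ in range(dim)]
    f_parent = sphere(parent)
    for _ in range(budget):
        child = [xi + rng.gauss(0, sigma) for xi in parent]
        f_child = sphere(child)
        if f_child <= f_parent:  # keep improvements (minimization)
            parent, f_parent = child, f_child
    return f_parent

def tune(n_configs=20, n_repeats=10, seed=1):
    """Random-search tuner: sample sigma values log-uniformly, average over
    repeated runs (the algorithm is stochastic), keep the best configuration."""
    rng = random.Random(seed)
    best_sigma, best_score = None, float("inf")
    for _ in range(n_configs):
        sigma = 10 ** rng.uniform(-3, 1)  # log-uniform in [1e-3, 10]
        score = statistics.mean(
            one_plus_one_es(sigma, rng=random.Random(rng.random()))
            for _ in range(n_repeats)
        )
        if score < best_score:
            best_sigma, best_score = sigma, score
    return best_sigma, best_score

if __name__ == "__main__":
    sigma, score = tune()
    print(f"best sigma = {sigma:.4f}, mean best f-value = {score:.3g}")
```

Real tuners replace the random sampling above with racing (IRACE), sequential model-based search (SPOT, SMAC), or iterated local search (ParamILS), but the evaluate-and-compare loop over repeated stochastic runs is the same.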
References
Adam, S.P., Alexandropoulos, S.-A.N., Pardalos, P.M., Vrahatis, M.N.: No Free Lunch Theorem: A Review, pp. 57–82. Springer International Publishing, Cham (2019). ISBN 978-3-030-12767-1, https://doi.org/10.1007/978-3-030-12767-1_5
Addis, B., Locatelli, M.: A new class of test functions for global optimization. J. Glob. Optim. 38(3), 479–501 (2007) ISSN 0925-5001; 1573-2916/e
Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10(1), 147–161 (2008) ISSN 1841-4311/e
Ansótegui, C., Sellmann, M., Tierney, K.: A gender-based genetic algorithm for the automatic configuration of algorithms. In: Proceedings of Principles and Practice of Constraint Programming-CP 2009: 15th International Conference, CP 2009 Lisbon, 20–24 Sept 2009, p. 142. Springer, Berlin (2009)
Ansótegui, C., Malitsky, Y., Samulowitz, H., Sellmann, M., Tierney, K.: Model-based genetic algorithms for algorithm configuration. In: Twenty-Fourth International Joint Conference on Artificial Intelligence (2015)
Audet, C., Orban, D.: Finding optimal algorithmic parameters using derivative-free optimization. SIAM J. Optim. 17(3), 642–664 (2006). ISSN 1052-6234; 1095-7189/e
Audet, C., Dang, K.-C., Orban, D.: Optimization of algorithms with OPAL. Math. Program. Comput. 6(3), 233–254 (2014). ISSN 1867-2949; 1867-2957/e
Bäck, T.: Evolutionary Algorithms in Theory and Practice. Oxford University Press, New York (1996)
Barr, R., Hickman, B.: Reporting computational experiments with parallel algorithms: issues, measures, and experts’ opinions. ORSA J. Comput. 5(1), 2–18 (1993)
Barr, R., Golden, B., Kelly, J., Rescende, M., Stewart, W.: Designing and reporting on computational experiments with heuristic methods. J. Heuristics 1(1), 9–32 (1995)
Barton, R.R.: Testing strategies for simulation optimization. In: Proceedings of the 19th Conference on Winter Simulation, WSC ’87, pp. 391–401. ACM, New York (1987). ISBN 0-911801-32-4, http://doi.acm.org/10.1145/318371.318618
Bartz-Beielstein, T.: Experimental Research in Evolutionary Computation—The New Experimentalism. Natural Computing Series. Springer, Berlin (2006). ISBN 3-540-32026-1, http://dx.doi.org/10.1007/3-540-32027-X
Bartz-Beielstein, T.: How to create generalizable results. In: Kacprzyk, J., Pedrycz, W. (eds.) Springer Handbook of Computational Intelligence, pp. 1127–1142. Springer, Berlin (2015). ISBN 978-3-662-43504-5, http://dx.doi.org/10.1007/978-3-662-43505-2_56
Bartz-Beielstein, T., Preuss, M.: The future of experimental research. In: Bartz-Beielstein, T., Chiarandini, M., Paquete, L., Preuss, M. (eds.) Experimental Methods for the Analysis of Optimization Algorithms, pp. 17–46. Springer, Berlin (2010)
Bartz-Beielstein, T., Parsopoulos, K.E., Vrahatis, M.N.: Design and analysis of optimization algorithms using computational statistics. Appl. Numer. Anal. Comput. Math. 1(2), 413–433 (2004)
Bartz-Beielstein, T., Lasarczyk, C., Preuss, M.: Sequential parameter optimization. In: McKay, B., et al. (eds.) Proceedings 2005 Congress on Evolutionary Computation (CEC’05), Edinburgh, pp. 773–780. IEEE Press, Piscataway (2005). ISBN 0-7803-9363-5, https://doi.org/10.1109/CEC.2005.1554761
Bartz-Beielstein, T., Chiarandini, M., Paquete, L., Preuss, M. (eds.): Experimental Methods for the Analysis of Optimization Algorithms. Springer, Berlin (2010). ISBN 978-3-642-02537-2, https://doi.org/10.1007/978-3-642-02538-9, http://www.springer.com/978-3-642-02537-2
Bartz-Beielstein, T., Branke, J., Mehnen, J., Mersmann, O.: Evolutionary algorithms. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 4(3), 178–195 (2014). ISSN 1942-4795. https://doi.org/10.1002/widm.1124
Beiranvand, V., Hare, W., Lucet, Y.: Best practices for comparing optimization algorithms. Optim. Eng. 18(4), 815–848 (2017)
Birattari, M., Stützle, T., Paquete, L., Varrentrapp, K.: A racing algorithm for configuring metaheuristics. In: Proceedings of the 4th Annual Conference on Genetic and Evolutionary Computation, GECCO’02, pp. 11–18. Morgan Kaufmann, San Francisco (2002). ISBN 1-55860-878-8, http://dl.acm.org/citation.cfm?id=2955491.2955494
Birattari, M., Yuan, Z., Balaprakash, P., Stützle, T.: Iterated F-race: an overview. Technical report (2009)
Bischl, B., Wessing, S., Bauer, N., Friedrichs, K., Weihs, C.: MOI-MBO: multiobjective infill for parallel model-based optimization. In: International Conference on Learning and Intelligent Optimization, pp. 173–186. Springer, Berlin (2014)
Bongartz, I., Conn, A.R., Gould, N., Toint, P.L.: CUTE: constrained and unconstrained testing environment. ACM Trans. Math. Softw. 21(1), 123–160 (1995). ISSN 0098-3500; 1557-7295/e
Box, M.J.: A comparison of several current optimization methods, and the use of transformations in constrained problems. Comput. J. 9, 67–77 (1966). ISSN 0010-4620; 1460-2067/e
Box, G.E.P., Wilson, K.B.: On the experimental attainment of optimum conditions. J. R. Stat. Soc. Series B Methodol. 13(1), 1–45 (1951). http://www.jstor.org/stable/2983966
Breiman, L.: Stacked regression. Mach. Learn. 24, 49–64 (1996)
Buckley, A.G.: Algorithm 709: testing algorithm implementations. ACM Trans. Math. Softw. 18(4), 375–391 (1992). ISSN 0098-3500, http://doi.acm.org/10.1145/138351.138378
Bussieck, M.R., Dirkse, S.P., Vigerske, S.: PAVER 2.0: an open source environment for automated performance analysis of benchmarking data. J. Glob. Optim. 59(2–3), 259–275 (2014). ISSN 0925-5001; 1573-2916/e
Campelo, F., Takahashi, F.: Sample size estimation for power and accuracy in the experimental comparison of algorithms. J. Heuristics 25(2), 305–338 (2019). ISSN 1572-9397, https://doi.org/10.1007/s10732-018-9396-7
Chen, C.H.: An effective approach to smartly allocate computing budget for discrete event simulation. In: Proceedings of the 34th IEEE Conference on Decision and Control, pp. 2598–2605 (1995)
Chiarandini, M., Goegebeur, Y.: Mixed models for the analysis of optimization algorithms. In: Bartz-Beielstein, T., Chiarandini, M., Paquete, L., Preuss, M. (eds.) Experimental Methods for the Analysis of Optimization Algorithms, pp. 225–264. Springer, Berlin (2010). ISBN 978-3-642-02537-2, https://doi.org/10.1007/978-3-642-02538-9, http://bib.mathematics.dk/preprint.php?id=DMF-2009-07-001
Cohen, P.R.: Empirical Methods for Artificial Intelligence. MIT Press, Cambridge (1995)
Coy, S.P., Golden, B.L., Runger, G.C., Wasil, E.A.: Using experimental design to find effective parameter settings for heuristics. J. Heuristics 7(1), 77–97 (2000)
Crainic, T.: Parallel Metaheuristics and Cooperative Search, pp. 419–451. Springer International Publishing, Cham (2019). ISBN 978-3-319-91086-4, https://doi.org/10.1007/978-3-319-91086-4_13
Crowder, H.P., Dembo, R.S., Mulvey, J.M.: On reporting computational experiments with mathematical software. ACM Trans. Math. Softw. 5(2), 193–203 (1979)
Daniels, S.J., Rahat, A.A., Everson, R.M., Tabor, G.R., Fieldsend, J.E.: A suite of computationally expensive shape optimisation problems using computational fluid dynamics. In: International Conference on Parallel Problem Solving from Nature, pp. 296–307. Springer, Berlin (2018)
De Jong, K.A.: An analysis of the behavior of a class of genetic adaptive systems. PhD thesis, University of Michigan (1975)
De Jong, K.: Parameter Setting in EAs: A 30 Year Perspective, pp. 1–18. Springer, Berlin (2007). ISBN 978-3-540-69432-8, https://doi.org/10.1007/978-3-540-69432-8_1
Doerr, C., Wagner, M.: Sensitivity of parameter control mechanisms with respect to their initialization. In: International Conference on Parallel Problem Solving from Nature (PPSN 2018), Coimbra. Lecture Notes in Computer Science, vol. 11102, pp. 360–372, Sept 2018. https://doi.org/10.1007/978-3-319-99259-4_29, https://hal.sorbonne-universite.fr/hal-01921055
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002). http://link.springer.com/10.1007/s101070100263
Domes, F., Fuchs, M., Schichl, H., Neumaier, A.: The optimization test environment. Optim. Eng. 15(2), 443–468 (2014). ISSN 1389-4420; 1573-2924/e
Eason, E.D.: Evidence of fundamental difficulties in nonlinear optimization code comparisons. In: Mulvey, J.M. (ed.) Evaluating Mathematical Programming Techniques, pp. 60–71. Springer, Berlin (1982). ISBN 978-3-642-95406-1
Eason, E., Fenton, R.: A comparison of numerical optimization methods for engineering design. J. Eng. Ind. 96(1), 196–200 (1974)
Eiben, A.E., Smith, J.E.: Introduction to Evolutionary Computing. Springer, Berlin (2003). ISBN 3-540-40184-9, http://www.worldcat.org/title/introduction-to-evolutionary-computing/oclc/52559217
Eiben, A.E., Smit, S.K.: Parameter tuning for configuring and analyzing evolutionary algorithms. Swarm Evol. Comput. 1(1), 19–31 (2011). https://doi.org/10.1016/j.swevo.2011.02.001, http://www.sciencedirect.com/science/article/pii/S2210650211000022
Eiben, A.E., Hinterding, R., Michalewicz, Z.: Parameter control in evolutionary algorithms. IEEE Trans. Evol. Comput. 3(2), 124–141 (1999). citeseer.nj.nec.com/eiben00parameter.html
Floudas, C.A., Pardalos, P.M., Adjiman, C.S., Esposito, W.R., Gümüş, Z.H., Harding, S.T., Klepeis, J.L., Meyer, C.A., Schweiger, C.A.: Handbook of Test Problems in Local and Global Optimization, vol. 33. Kluwer Academic, Dordrecht (1999). ISBN 0-7923-5801-5/hbk
Forrester, A., Sóbester, A., Keane, A.: Multi-fidelity optimization via surrogate modelling. Proc. R. Soc. A Math. Phys. Eng. Sci. 463(2088), 3251–3269 (2007). https://doi.org/10.1098/rspa.2007.1900
Ginsbourger, D., Le Riche, R., Carraro, L.: Kriging is well-suited to parallelize optimization. In: Computational Intelligence in Expensive Optimization Problems, pp. 131–162. Springer, Berlin (2010)
Goldberg, D.E., Deb, K.: A comparative analysis of selection schemes used in genetic algorithms. Foundations of Genetic Algorithms, vol. 1, pp. 69–93. Elsevier, Amsterdam (1991). https://doi.org/10.1016/B978-0-08-050684-5.50008-2, http://www.sciencedirect.com/science/article/pii/B9780080506845500082
Goldberg, D.E., Deb, K., Clark, J.H.: Genetic algorithms, noise, and the sizing of populations. Complex Syst. 6, 333 (1992)
Gould, N., Scott, J.: A note on performance profiles for benchmarking software. ACM Trans. Math. Softw. 43(2), Article 15 (2016). ISSN 0098-3500; 1557-7295/e
Grefenstette, J.: Optimization of control parameters for genetic algorithms. IEEE Trans. Syst. Man Cybern. 16(1), 122–128 (1986). ISSN 0018-9472, https://doi.org/10.1109/TSMC.1986.289288
Haftka, R.T.: Requirements for papers focusing on new or improved global optimization algorithms. Struct. Multidiscipl. Optim. 54(1), 1–1 (2016). ISSN 1615-1488, https://doi.org/10.1007/s00158-016-1491-5
Haftka, R.T., Villanueva, D., Chaudhuri, A.: Parallel surrogate-assisted global optimization with expensive functions—a survey. Struct. Multidiscipl. Optim. 54(1), 3–13 (2016). ISSN 1615-1488, https://doi.org/10.1007/s00158-016-1432-3
Hare, W., Wang, Y.: Fairer benchmarking of optimization algorithms via derivative free optimization. Technical report, Optimization-online (2010)
Harik, G.R., Lobo, F.G.: A parameter-less genetic algorithm. In: Proceedings of the 1st Annual Conference on Genetic and Evolutionary Computation - Volume 1, GECCO’99, pp. 258–265. Morgan Kaufmann, San Francisco (1999). ISBN 1-55860-611-4, http://dl.acm.org/citation.cfm?id=2933923.2933949
Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning. Springer, Berlin (2001)
Hillstrom, K.E.: A simulation test approach to the evaluation of nonlinear optimization algorithms. ACM Trans. Math. Softw. 3(4), 305–315 (1977). http://doi.acm.org/10.1145/355759.355760
Himmelblau, D.M.: Applied Nonlinear Programming. McGraw-Hill, New York (1972)
Hoos, H.H., Stützle, T.: Stochastic Local Search—Foundations and Applications. Elsevier, Amsterdam (2005)
Huang, D., Allen, T.T., Notz, W.I., Zeng, N.: Global optimization of stochastic black-box systems via sequential Kriging meta-models. J. Glob. Optim. 34(3), 441–466 (2006)
Hutter, F., Babic, D., Hoos, H.H., Hu, A.J.: Boosting verification by automatic tuning of decision procedures. In: Proceedings of the Formal Methods in Computer Aided Design, FMCAD ’07, pp. 27–34. IEEE Computer Society, Washington (2007). ISBN 0-7695-3023-0, https://doi.org/10.1109/FMCAD.2007.13
Hutter, F., Hoos, H.H., Leyton-Brown, K., Stützle, T.: ParamILS: an automatic algorithm configuration framework. J. Artif. Intell. Res. 36, 267–306 (2009)
Hutter, F., Hoos, H., Leyton-Brown, K.: Sequential model-based optimization for general algorithm configuration. In: Learning and Intelligent Optimization, pp. 507–523 (2011). https://maanvs03.gm.fh-koeln.de/webstore/Classified.d/Hutt11a.d/Hutt11a.pdf
IBM Corporation: CPLEX’s automatic tuning tool. Technical report, IBM (2014)
Jackson, R.H.F., Boggs, P.T., Nash, S.G., Powell, S.: Guidelines for reporting results of computational experiments. Report of the ad hoc committee. Math. Program. 49(1), 413–425 (1990). ISSN 1436-4646, https://doi.org/10.1007/BF01588801
Jin, Y., Wang, H., Chugh, T., Guo, D., Miettinen, K.: Data-driven evolutionary optimization: an overview and case studies. IEEE Trans. Evol. Comput. 23(3), 442–458 (2019). ISSN 1089-778X, https://doi.org/10.1109/TEVC.2018.2869001
Johnson, D.S., Aragon, C.R., McGeoch, L.A., Schevon, C.: Optimization by simulated annealing: an experimental evaluation. Part I, graph partitioning. Oper. Res. 37(6), 865–892 (1989)
Johnson, D.S., Aragon, C.R., McGeoch, L.A., Schevon, C.: Optimization by simulated annealing: an experimental evaluation. Part II, graph coloring and number partitioning. Oper. Res. 39(3), 378–406 (1991)
Johnson, D.S., McGeoch, L., Rothberg, E.: Asymptotic experimental analysis for the Held-Karp traveling salesman bound. In: Proceedings of the Seventh Annual ACM-SIAM Symposium on Discrete Algorithms, vol. 81, pp. 341–350 (1996)
Jones, D.R., Schonlau, M., Welch, W.J.: Efficient global optimization of expensive black-box functions. J. Glob. Optim. 13(4), 455–492 (1998)
Jung, C., Zaefferer, M., Bartz-Beielstein, T., Rudolph, G.: Metamodel-based optimization of hot rolling processes in the metal industry. Int. J. Adv. Manuf. Technol. 1–15 (2016). ISSN 1433-3015, https://doi.org/10.1007/s00170-016-9386-6
Kennedy, J., Eberhart, R.C.: Particle swarm optimization. In: Proceedings IEEE International Conference on Neural Networks, pp. 1942–1948. IEEE, Piscataway (1995)
Kleijnen, J.P.C.: Statistical Tools for Simulation Practitioners. Marcel Dekker, New York (1987)
Kleijnen, J.P.C.: Design and Analysis of Simulation Experiments. Springer, New York (2008)
Kleijnen, J.P.C.: Design and Analysis of Simulation Experiments. International Series in Operations Research and Management Science. Springer International Publishing, New York (2015). ISBN 978-3-319-18087-8, https://books.google.de/books?id=Fq4YCgAAQBAJ
Kramer, O.: Evolutionary self-adaptation: a survey of operators and strategy parameters. Evol. Intell. 3(2), 51–65 (2010). https://maanvs03.gm.fh-koeln.de/webstore/Classified.d/Kram10a.d/Kram10a.pdf
Lenard, M.L., Minkoff, M.: Randomly generated test problems for positive definite quadratic programming. ACM Trans. Math. Softw. 10(1), 86–96 (1984). ISSN 0098-3500, http://doi.acm.org/10.1145/356068.356075
Liu, D., Zhang, X.: Test problem generator by neural network for algorithms that try solving nonlinear programming problems globally. J. Glob. Optim. 16(3), 229–243 (2000). ISSN 0925-5001; 1573-2916/e
Lobo, F.G., Lima, C.F., Michalewicz, Z. (eds.): Parameter Setting in Evolutionary Algorithms. Studies in Computational Intelligence, vol. 54. Springer, Berlin (2007). ISBN 978-3-540-69431-1
López-Ibáñez, M., Dubois-Lacoste, J., Stützle, T., Birattari, M.: The irace package, iterated race for automatic algorithm configuration. Technical Report 2011-004, IRIDIA (2011)
McGeoch, C.C.: Experimental Analysis of Algorithms. PhD thesis, Carnegie Mellon University, Pittsburgh (1986)
McGeoch, C.C.: Toward an experimental method for algorithm simulation. INFORMS J. Comput. 8(1), 1–15 (1996)
McGeoch, C.C.: Experimental algorithmics. Commun. ACM 50(11), 27–31 (2007). ISSN 0001-0782, http://doi.acm.org/10.1145/1297797.1297818
McGeoch, C.C.: A Guide to Experimental Algorithmics, 1st edn. Cambridge University Press, New York (2012). ISBN 0521173019, 9780521173018
Miele, A., Tietze, J., Levy, A.: Comparison of several gradient algorithms for mathematical programming problems. Technical report, Rice University (1972)
Montgomery, D.C.: Design and Analysis of Experiments, 5th edn. Wiley, New York (2001)
Moré, J.J., Wild, S.M.: Benchmarking derivative-free optimization algorithms. SIAM J. Optim. 20(1), 172–191 (2009). ISSN 1052-6234; 1095-7189/e
Moré, J.J., Garbow, B.S., Hillstrom, K.E.: Testing unconstrained optimization software. ACM Trans. Math. Softw. 7(1), 17–41 (1981)
Mühlenbein, H.: How genetic algorithms really work: I. Mutation and hill climbing. In: Proc. 2nd Int. Conf. on Parallel Problem Solving from Nature. Elsevier, Amsterdam (1992). https://ci.nii.ac.jp/naid/10022158367/en/
Muñoz, M.A., Sun, Y., Kirley, M., Halgamuge, S.K.: Algorithm selection for black-box continuous optimization problems: a survey on methods and challenges. Inf. Sci. 317, 224–245 (2015). ISSN 0020-0255, https://doi.org/10.1016/j.ins.2015.05.010, http://www.sciencedirect.com/science/article/pii/S0020025515003680
Nelder, J.A., Mead, R.: A simplex method for function minimization. Comput. J. 7(4), 308–313 (1965)
Nell, C., Fawcett, C., Hoos, H.H., Leyton-Brown, K.: Hal: a framework for the automated analysis and design of high-performance algorithms. In: Coello, C.A.C. (ed.) Learning and Intelligent Optimization, pp. 600–615. Springer, Berlin (2011). ISBN 978-3-642-25566-3
Neumann-Brosig, M., Marco, A., Schwarzmann, D., Trimpe, S.: Data-efficient auto-tuning with Bayesian optimization: an industrial control study (2018). CoRR, abs/1812.06325, http://arxiv.org/abs/1812.06325
Parejo, J.A., Ruiz-Cortés, A., Lozano, S., Fernandez, P.: Metaheuristic optimization frameworks: a survey and benchmarking. Soft Comput. 16(3), 527–561 (2012). ISSN 1433-7479, https://doi.org/10.1007/s00500-011-0754-8
Pavón, R., Díaz, F., Laza, R., Luzón, V.: Automatic parameter tuning with a Bayesian case-based reasoning system. A case of study. Expert Syst. Appl. 36(2, Part 2), 3407–3420 (2009). ISSN 0957-4174, https://doi.org/10.1016/j.eswa.2008.02.044, http://www.sciencedirect.com/science/article/pii/S0957417408001292
R Core Team: R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna (2018). https://www.R-project.org
Rardin, R., Uzsoy, R.: Experimental evaluation of heuristic optimization algorithms: a tutorial. J. Heuristics 7(3), 261–304 (2001)
Rechenberg, I.: Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution. PhD thesis, Department of Process Engineering, Technical University of Berlin (1971)
Ridge, E.: Design of experiments for the tuning of optimisation algorithms. PhD thesis, The University of York (2007)
Ridge, E., Kudenko, D.: Tuning an Algorithm Using Design of Experiments, pp. 265–286. Springer, Berlin (2010). ISBN 978-3-642-02538-9, https://doi.org/10.1007/978-3-642-02538-9_11
Sacks, J., Welch, W.J., Mitchell, T.J., Wynn, H.P.: Design and analysis of computer experiments. Stat. Sci. 4(4), 409–435 (1989)
Saltelli, A., Tarantola, S., Campolongo, F., Ratto, M.: Sensitivity Analysis in Practice. Wiley, New York (2004). ISBN 978-0-470-87095-2, https://doi.org/10.1002/0470870958
Saltelli, A., Ratto, M., Andres, T., Campolongo, F., Cariboni, J., Gatelli, D., Saisana, M., Tarantola, S.: Global Sensitivity Analysis. Wiley, New York (2008)
Santner, T.J., Williams, B.J., Notz, W.I.: The Design and Analysis of Computer Experiments. Springer, Berlin (2003)
Schagen, A., Rehbach, F., Bartz-Beielstein, T.: Model-based evolutionary algorithm for optimization of gas distribution systems in power plant electrostatic precipitators. Int. J. Gener. Storage Electricity Heat 9, 65–72 (2018)
Schwefel, H.-P.: Evolutionsstrategie und numerische Optimierung. PhD thesis, Technische Universität Berlin, Fachbereich Verfahrenstechnik, Berlin (1975)
Schwefel, H.P.: Evolution and Optimum Seeking. Sixth-Generation Computer Technology. Wiley, New York (1995)
Sloss, A.N., Gustafson, S.: 2019 Evolutionary Algorithms Review (2019). http://arxiv.org/abs/1906.08870
Smit, S.K., Eiben, A.E.: Multi-problem parameter tuning using BONESA. In: Hao, J.K., Legrand, P., Collet, P., Monmarché, N., Lutton, E., Schoenauer, M. (eds.) Artificial Evolution, 10th International Conference Evolution Artificielle, pp. 222–233. Springer, Berlin (2011)
Sóbester, A., Leary, S.J., Keane, A.J.: A parallel updating scheme for approximating and optimizing high fidelity computer simulations. Struct. Multidiscipl. Optim. 27(5), 371–383 (2004)
Tukey, J.W.: Exploratory Data Analysis. Addison-Wesley, Reading (1977)
Vodopija, A., Stork, J., Bartz-Beielstein, T., Filipič, B.: Model-based multiobjective optimization of elevator group control. In: Filipič, B., Bartz-Beielstein, T. (eds.) International Conference on High-Performance Optimization in Industry, HPOI 2018, Ljubljana, pp. 43–46, Oct 2018
Wolpert, D.H., Macready, W.G.: No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1(1), 67–82 (1997)
Yeguas, E., Luzón, M., Pavón, R., Laza, R., Arroyo, G., Díaz, F.: Automatic parameter tuning for evolutionary algorithms using a Bayesian case-based reasoning system. Appl. Soft Comput. 18, 185–195 (2014). ISSN 1568-4946, https://doi.org/10.1016/j.asoc.2014.01.032, http://www.sciencedirect.com/science/article/pii/S1568494614000519
Zheng, F., Simpson, A.R., Zecchin, A.C.: An efficient hybrid approach for multiobjective optimization of water distribution systems. Water Resour. Res. 50(5), 3650–3671 (2014). https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1002/2013WR014143
Acknowledgment
This work was supported by OWOS (FKZ: 005-1703-0011).
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this chapter
Bartz-Beielstein, T., Rehbach, F., Rebolledo, M. (2021). Tuning Algorithms for Stochastic Black-Box Optimization: State of the Art and Future Perspectives. In: Pardalos, P.M., Rasskazova, V., Vrahatis, M.N. (eds.) Black Box Optimization, Machine Learning, and No-Free Lunch Theorems. Springer Optimization and Its Applications, vol. 170. Springer, Cham. https://doi.org/10.1007/978-3-030-66515-9_3
Print ISBN: 978-3-030-66514-2
Online ISBN: 978-3-030-66515-9
eBook Packages: Mathematics and Statistics (R0)