Abstract
Benchmark experiments are required to test, compare, tune, and understand optimization algorithms. Ideally, benchmark problems closely reflect real-world problem behavior. Yet, real-world problems are not always readily available for benchmarking. For example, evaluation costs may be too high, or resources are unavailable (e.g., software or equipment). As a solution, data from previous evaluations can be used to train surrogate models which are then used for benchmarking. The goal is to generate test functions on which the performance of an algorithm is similar to that on the real-world objective function. However, predictions from data-driven models tend to be smoother than the ground-truth from which the training data is derived. This is especially problematic when the training data becomes sparse. The resulting benchmarks may not reflect the landscape features of the ground-truth, are too easy, and may lead to biased conclusions.
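The smoothing effect described above can be illustrated with a minimal kriging sketch (all function names, parameters, and the test function here are illustrative, not taken from the paper's code): the posterior mean of a Gaussian process fitted to sparse samples interpolates the data, but between samples it typically varies less than the rugged ground-truth.

```python
import numpy as np

def k(a, b, theta=0.1):
    """Gaussian (squared-exponential) kernel matrix between 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-d**2 / (2 * theta**2))

f = lambda x: np.sin(8 * np.pi * x) + x   # illustrative rugged "ground-truth"
X = np.linspace(0, 1, 6)                  # sparse training sample
y = f(X)

Xs = np.linspace(0, 1, 400)               # dense evaluation grid
K = k(X, X) + 1e-10 * np.eye(len(X))      # small jitter for numerical stability
alpha = np.linalg.solve(K, y)
mu = k(Xs, X) @ alpha                     # GP posterior mean (the "prediction")
# mu reproduces the training data exactly, but decays toward the prior mean
# between the sparse samples, washing out the ground-truth's oscillations.
```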
To resolve this, we use simulation of Gaussian processes instead of estimation (or prediction). This retains the covariance properties estimated during model training. While previous research suggested a decomposition-based approach for a small-scale, discrete problem, we show that the spectral simulation method enables simulation for continuous optimization problems. In a set of experiments with an artificial ground-truth, we demonstrate that this yields more accurate benchmarks than simply predicting with the Gaussian process model.
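As a rough, hypothetical sketch of the spectral method (not the authors' implementation): by Bochner's theorem, a realization of a stationary zero-mean GP can be generated as a sum of cosines whose frequencies are drawn from the kernel's spectral density, with uniformly random phases. For the Gaussian kernel k(d) = exp(-d²/(2θ²)), that spectral density is normal with standard deviation 1/θ.

```python
import numpy as np

def simulate_gp_spectral(x, theta=0.2, n_freq=500, rng=None):
    """Draw one realization of a zero-mean GP with Gaussian kernel
    k(d) = exp(-d^2 / (2 * theta^2)) at the 1-D points x,
    using the spectral (random cosine) method."""
    rng = rng or np.random.default_rng()
    w = rng.normal(0.0, 1.0 / theta, size=n_freq)   # frequencies ~ spectral density
    b = rng.uniform(0.0, 2 * np.pi, size=n_freq)    # independent random phases
    # Each realization is a superposition of cosines; the scaling
    # sqrt(2/n_freq) yields unit marginal variance in expectation.
    return np.sqrt(2.0 / n_freq) * np.cos(np.outer(x, w) + b).sum(axis=1)

x = np.linspace(0, 1, 200)
y = simulate_gp_spectral(x, rng=np.random.default_rng(42))
```

Unlike the posterior mean, each such realization retains the covariance structure (and hence the ruggedness) implied by the fitted kernel, which is what makes it suitable as a benchmark test function.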
Notes
1. See the GBEA website at http://www.gm.fh-koeln.de/~naujoks/gbea/gamesbench_doc.html#subdata. Accessed 2020-08-03.
2. Reproducible code and a complete set of the presented figures are provided at https://github.com/martinzaefferer/zaef20b. For easily accessible interfaces and demonstrations, see https://github.com/martinzaefferer/COBBS.
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Zaefferer, M., Rehbach, F. (2020). Continuous Optimization Benchmarks by Simulation. In: Bäck, T., et al. Parallel Problem Solving from Nature – PPSN XVI. PPSN 2020. Lecture Notes in Computer Science(), vol 12269. Springer, Cham. https://doi.org/10.1007/978-3-030-58112-1_19
Print ISBN: 978-3-030-58111-4
Online ISBN: 978-3-030-58112-1