
Uncertainty Management Using Sequential Parameter Optimization

  • Thomas Bartz-Beielstein
  • Christian Jung
  • Martin Zaefferer
Part of the Operations Research/Computer Science Interfaces book series (ORCS, volume 59)

Abstract

Sequential Parameter Optimization (SPO) is a meta-model-based search heuristic that combines classical and modern statistical techniques. It was originally developed for the analysis of search heuristics such as simulated annealing, particle swarm optimization, and evolutionary algorithms [6]. Here, SPO itself is used as a search heuristic, i.e., it is applied directly to the objective function. An introduction to the state-of-the-art R implementation of SPO, the so-called sequential parameter optimization toolbox (SPOT), is presented in [5].
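
To illustrate what "applying SPO to the objective function directly" can look like in practice, the following minimal sketch uses the SPOT R package on a noisy test function. It is an assumed example, not code from the chapter: the spot() call, the control options funEvals and model = buildKriging, and the noisySphere function reflect the interface of recent CRAN releases of SPOT, which differs from the 2012 toolbox version described in [5].

    ## Minimal sketch (assumed interface of a recent CRAN release of SPOT,
    ## not necessarily the version described in [5]).
    library(SPOT)

    ## Noisy test function: SPOT passes a matrix of candidate points (one per
    ## row) and expects one objective value per row.
    noisySphere <- function(x) {
      y <- apply(x, 1, function(row) sum(row^2) + rnorm(1, sd = 0.1))
      matrix(y, ncol = 1)            # one column of (noisy) objective values
    }

    res <- spot(
      x = NULL,                      # let SPOT create the initial Latin hypercube design
      fun = noisySphere,             # SPO applied to the objective function directly
      lower = c(-5, -5),
      upper = c(5, 5),
      control = list(
        funEvals = 50,               # total budget of objective function evaluations
        model = buildKriging         # Kriging meta-model; a random forest model is an alternative
      )
    )

    res$xbest   # best design point found
    res$ybest   # corresponding objective value

In this setup the meta-model (here Kriging) is fitted to the evaluated design points and proposes new candidate points sequentially, so the expensive or noisy objective function is queried only within the stated evaluation budget.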

Keywords

Random Forest, Design Point, Latin Hypercube Sampling, Kriging Model, Meta Model

References

  1. Arnold, D.V., Beyer, H.-G.: A comparison of evolution strategies with other direct search methods in the presence of noise. Comput. Optim. Appl. 24(1), 135–159 (2003)
  2. Barton, R.R., Meckesheimer, M.: Metamodel-based simulation optimization. In: Henderson, S.G., Nelson, B.L. (eds.) Simulation. Handbooks in Operations Research and Management Science, vol. 13, pp. 535–574. Elsevier, Amsterdam (2006)
  3. Bartz-Beielstein, T., Friese, M.: Sequential parameter optimization and optimal computational budget allocation for noisy optimization problems. CIOP Technical Report 02/11, Research Center CIOP (Computational Intelligence, Optimization and Data Mining), Cologne University of Applied Sciences, Faculty of Computer Science and Engineering Science, Jan 2011
  4. Bartz-Beielstein, T., Preuss, M.: The future of experimental research. In: Bartz-Beielstein, T., Chiarandini, M., Paquete, L., Preuss, M. (eds.) Experimental Methods for the Analysis of Optimization Algorithms, pp. 17–46. Springer, Berlin/Heidelberg/New York (2010)
  5. Bartz-Beielstein, T., Zaefferer, M.: A gentle introduction to sequential parameter optimization. Technical Report TR 01/2012, CIplus, 2012
  6. Bartz-Beielstein, T., Parsopoulos, K.E., Vrahatis, M.N.: Design and analysis of optimization algorithms using computational statistics. Appl. Numer. Anal. Comput. Math. (ANACM) 1(2), 413–433 (2004)
  7. Bartz-Beielstein, T., Lasarczyk, C., Preuss, M.: The sequential parameter optimization toolbox. In: Bartz-Beielstein, T., Chiarandini, M., Paquete, L., Preuss, M. (eds.) Experimental Methods for the Analysis of Optimization Algorithms, pp. 337–360. Springer, Berlin/Heidelberg/New York (2010)
  8. Bartz-Beielstein, T., Friese, M., Zaefferer, M., Naujoks, B., Flasch, O., Konen, W., Koch, P.: Noisy optimization with sequential parameter optimization and optimal computational budget allocation. In: Proceedings of the 13th Annual Conference Companion on Genetic and Evolutionary Computation, GECCO '11, pp. 119–120. ACM, New York, NY (2011)
  9. Breiman, L.: Random forests. Mach. Learn. 45(1), 5–32 (2001)
  10. Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Wadsworth, Monterey, CA (1984)
  11. Chambers, J., Cleveland, W., Kleiner, B., Tukey, P.: Graphical Methods for Data Analysis. Wadsworth, Belmont, CA (1983)
  12. Chen, C.-H., Lee, L.H.: Stochastic Simulation Optimization. World Scientific, Singapore (2011)
  13. Chen, H.C., Chen, C.H., Dai, L., Yücesan, E.: New development of optimal computing budget allocation for discrete event simulation. In: Andradóttir, S., Healy, K.J., Withers, D.H., Nelson, B.L. (eds.) Proceedings of the 1997 Winter Simulation Conference, pp. 334–341. IEEE Computer Society, Piscataway, NJ (1997)
  14. Chen, J., Chen, C., Kelton, D.: Optimal computing budget allocation of indifference-zone-selection procedures. Working paper, 2003. http://www.cba.uc.edu/faculty/keltonwd. Accessed 6 Jan 2005
  15. Dancik, G.M., Dorman, K.S.: mlegp: statistical analysis for computer models of biological systems using R. Bioinformatics 24(17), 1966–1967 (2008)
  16. Forrester, A., Sóbester, A., Keane, A.: Multi-fidelity optimization via surrogate modelling. Proc. Roy. Soc. A Math. Phys. Eng. Sci. 463(2088), 3251–3269 (2007)
  17. Forrester, A., Sóbester, A., Keane, A.: Engineering Design via Surrogate Modelling. Wiley, New York (2008)
  18. Furrer, R., Nychka, D., Sain, S.: fields: tools for spatial data. R package version 6.3 (2010)
  19. Huang, D., Allen, T.T., Notz, W.I., Zeng, N.: Global optimization of stochastic black-box systems via sequential kriging meta-models. J. Glob. Optim. 34(3), 441–466 (2006)
  20. Jin, Y.: A comprehensive survey of fitness approximation in evolutionary computation. Soft Comput. 9(1), 3–12 (2005)
  21. Jin, Y., Branke, J.: Evolutionary optimization in uncertain environments—a survey. IEEE Trans. Evol. Comput. 9(3), 303–317 (2005)
  22. Jones, D., Schonlau, M., Welch, W.: Efficient global optimization of expensive black-box functions. J. Glob. Optim. 13, 455–492 (1998)
  23. Karatzoglou, A., Smola, A., Hornik, K., Zeileis, A.: kernlab – an S4 package for kernel methods in R. J. Stat. Softw. 11(9), 1–20 (2004)
  24. Kennedy, M.C., O'Hagan, A.: Predicting the output from a complex computer code when fast approximations are available. Biometrika 87(1), 1–13 (2000)
  25. Kleijnen, J.P.C.: Design and Analysis of Simulation Experiments. Springer, New York, NY (2008)
  26. Krige, D.G.: A statistical approach to some basic mine valuation problems on the Witwatersrand. J. Chem. Metall. Min. Soc. S. Afr. 52(6), 119–139 (1951)
  27. Lasarczyk, C.W.G.: Genetische Programmierung einer algorithmischen Chemie. Ph.D. thesis, Technische Universität Dortmund (2007)
  28. Liaw, A., Wiener, M.: Classification and regression by randomForest. R News 2(3), 18–22 (2002)
  29. Lophaven, S., Nielsen, H., Søndergaard, J.: DACE—a MATLAB kriging toolbox. Technical Report IMM-REP-2002-12, Informatics and Mathematical Modelling, Technical University of Denmark, Copenhagen, Denmark (2002)
  30. McKay, M.D., Beckman, R.J., Conover, W.J.: A comparison of three methods for selecting values of input variables in the analysis of output from a computer code. Technometrics 21(2), 239–245 (1979)
  31. Okada, M., Ariizumi, T., Noma, Y., Yamazaki, Y.: On the behavior of edge rolling in hot strip mills. In: International Conference on Steel Rolling, vol. 1, pp. 275–286 (1980)
  32. Pukelsheim, F.: Optimal Design of Experiments. Wiley, New York, NY (1993)
  33. Roustant, O., Ginsbourger, D., Deville, Y.: DiceKriging, DiceOptim: two R packages for the analysis of computer experiments by kriging-based metamodeling and optimization. J. Stat. Softw. 51, 1–55 (2010)
  34. Sacks, J., Welch, W.J., Mitchell, T.J., Wynn, H.P.: Design and analysis of computer experiments. Stat. Sci. 4(4), 409–435 (1989)
  35. Santner, T.J., Williams, B.J., Notz, W.I.: The Design and Analysis of Computer Experiments. Springer, Berlin/Heidelberg/New York (2003)
  36. Stagge, P.: Averaging efficiently in the presence of noise. In: Eiben, A. (ed.) Parallel Problem Solving from Nature, PPSN V, pp. 188–197. Springer, Berlin/Heidelberg/New York (1998)
  37. Takei, H., Onishi, Y., Yamasaki, Y., Takekoshi, A., Yamamoto, M., Okado, M.: Automatic width control of rougher in hot strip mill. Nippon Kokan Technical Report 34, Computer Systems Development Department, Fukuyama Works (1982)
  38. Takeuchi, M., Hoshiya, M., Watanabe, K., Hirata, O., Kikuma, T., Sadahiro, S.: Heavy width reduction rolling of slabs. Nippon Steel Technical Report Overseas, No. 21, pp. 235–246 (1983)
  39. Tukey, J.: The philosophy of multiple comparisons. Stat. Sci. 6, 100–116 (1991)
  40. Wankhede, M.J., Bressloff, N.W., Keane, A.J.: Combustor design optimization using co-kriging of steady and unsteady turbulent combustion. J. Eng. Gas Turbines Power 133(12), 121504 (2011)

Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  • Thomas Bartz-Beielstein (1)
  • Christian Jung (1)
  • Martin Zaefferer (1)

  1. Fachhochschule Köln, Faculty of Computer Science and Engineering Science, Gummersbach, Germany
