An Initialization Strategy for High-Dimensional Surrogate-Based Expensive Black-Box Optimization

Conference paper
Modeling and Optimization: Theory and Applications

Part of the book series: Springer Proceedings in Mathematics & Statistics (PROMS, volume 62)

Abstract

Surrogate-based optimization methods build surrogate models of expensive black-box objective and constraint functions using previously evaluated points and use these models to guide the search for an optimal solution. These methods require considerably more computational overhead and memory than other optimization methods, so their applicability to high-dimensional problems is somewhat limited. Many surrogates, such as radial basis functions (RBFs) with linear polynomial tails, require a maximal set of affinely independent points to fit the initial model. This paper proposes an initialization strategy for surrogate-based methods called underdetermined simplex gradient descent (USGD) that uses underdetermined simplex gradients to make progress towards the optimum while building a maximal set of affinely independent points. Numerical experiments on a 72-dimensional groundwater bioremediation problem and on 200-dimensional and 1000-dimensional instances of 16 well-known test problems demonstrate that the proposed USGD initialization strategy yields dramatic improvements in the objective function value compared to standard initialization procedures. Moreover, USGD initialization substantially improves the performance of two optimization algorithms that use RBF surrogates compared to standard initialization methods on the same test problems.
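
To make the central computation concrete: given a base point x0 and k < n other evaluated points, the underdetermined simplex gradient is the minimum-norm solution g of S^T g = δf, where column j of S is the displacement of the j-th point from x0 and δf collects the corresponding differences in function values. The sketch below (Python with NumPy; the function names and the fixed-length step rule are illustrative assumptions, not the paper's code) shows this estimate and one descent step built on it.

    import numpy as np

    def underdetermined_simplex_gradient(x0, f0, points, fvals):
        """Minimum-norm solution g of S^T g = delta_f, where column j of S is
        points[j] - x0 and delta_f[j] = fvals[j] - f0. With k < n points the
        system is underdetermined and lstsq returns the minimum-norm g."""
        S = np.column_stack([x - x0 for x in points])   # n-by-k, k may be < n
        delta_f = np.asarray(fvals) - f0
        g, *_ = np.linalg.lstsq(S.T, delta_f, rcond=None)
        return g

    def descent_step(f, x0, f0, points, fvals, step=0.1):
        """One step along the negative simplex gradient. The trial point and
        its value are appended to the evaluation history either way, and the
        better of the current and trial points is returned."""
        g = underdetermined_simplex_gradient(x0, f0, points, fvals)
        norm = np.linalg.norm(g)
        if norm == 0.0:                    # no descent information yet
            return x0, f0
        x_trial = x0 - step * g / norm     # fixed-length step; the paper's rule differs
        f_trial = f(x_trial)               # one expensive evaluation
        points.append(x_trial)
        fvals.append(f_trial)
        return (x_trial, f_trial) if f_trial < f0 else (x0, f0)

Note that the minimum-norm gradient lies in the span of the displacement columns, so descent steps of this kind never enlarge the affine hull of the evaluated points on their own; the USGD strategy therefore interleaves such steps with evaluations along new affinely independent directions until the maximal set needed to fit the RBF surrogate with a linear tail is complete, a part the sketch omits.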



Acknowledgements

I would like to thank Ismael Vaz and Luís Vicente for the PSwarm package, which includes a Matlab pattern search code with options for combining it with RBF models. I am also grateful to Jorge Moré and Stefan Wild for their Matlab code that creates performance and data profiles.

Author information


Corresponding author

Correspondence to Rommel G. Regis.


Appendix

Figures 8–10 show the average progress curves for the surrogate-based optimization algorithms on the 1000-D test problems. Because the computational overhead of running some of these methods is enormous, each algorithm was run for only 10 trials instead of 30 on each problem.


Copyright information

© 2013 Springer Science+Business Media New York

Cite this paper

Regis, R.G. (2013). An Initialization Strategy for High-Dimensional Surrogate-Based Expensive Black-Box Optimization. In: Zuluaga, L., Terlaky, T. (eds) Modeling and Optimization: Theory and Applications. Springer Proceedings in Mathematics & Statistics, vol 62. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-8987-0_3
