Learning Enabled Constrained Black-Box Optimization

Black Box Optimization, Machine Learning, and No-Free Lunch Theorems

Part of the book series: Springer Optimization and Its Applications (SOIA, volume 170)

Abstract

This chapter addresses black-box constrained optimization, in which both the objective function and the constraints are unknown and can be observed only pointwise. Both deterministic and probabilistic surrogate models are considered: the latter, analysed in greater detail, are based on Gaussian Processes and Bayesian Optimization, which manage the exploration–exploitation dilemma and improve sample efficiency. Particularly challenging is the case in which the feasible region may be disconnected and the objective function cannot be evaluated outside it; this situation, known as a “partially defined objective function” or “non-computable domains”, requires a novel two-phase approach: a first phase learns the feasible region via SVM classification, and a second, optimization phase relies on a Gaussian Process. This approach is the main focus of the chapter, which analyses the associated modelling and computational issues and demonstrates the sample efficiency of the resulting algorithms.
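The two-phase scheme outlined in the abstract can be sketched as follows. This is an illustrative toy implementation, not the authors' algorithm: it trains a scikit-learn SVM classifier on feasibility labels of a deliberately disconnected one-dimensional feasible region, then runs a simple Gaussian-Process-based Bayesian Optimization loop with a lower-confidence-bound acquisition, calling the objective only at points the classifier predicts feasible. The test functions, sample sizes, and acquisition constant are all invented for illustration.

```python
# Sketch of the two-phase approach: (1) learn the feasible region with an
# SVM classifier; (2) GP-based Bayesian Optimization on predicted-feasible
# points. Toy problem, invented constants; NOT the chapter's implementation.
import numpy as np
from sklearn.svm import SVC
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def feasible(x):
    # Toy disconnected feasible region on [-3, 3].
    return bool(np.abs(np.sin(3 * x[0])) > 0.5)

def objective(x):
    # "Partially defined": computable only inside the feasible region.
    assert feasible(x), "objective not computable outside the feasible region"
    return float(np.sin(x[0]) + 0.1 * x[0] ** 2)

# Phase 1: sample the box, record feasibility labels, train the classifier.
X_all = rng.uniform(-3, 3, size=(40, 1))
y_feas = np.array([feasible(x) for x in X_all])
svm = SVC(kernel="rbf").fit(X_all, y_feas)

# Phase 2: BO with a GP surrogate, restricted to predicted-feasible points;
# lower confidence bound (LCB) acquisition for minimization.
X = X_all[y_feas]
y = np.array([objective(x) for x in X])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(15):
    gp.fit(X, y)
    cand = rng.uniform(-3, 3, size=(512, 1))
    mask = svm.predict(cand)                      # predicted feasibility
    if mask.any():
        cand = cand[mask]
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmin(mu - 2.0 * sigma)]    # LCB, minimization
    if feasible(x_next):                          # safe to call the objective
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))
    # The feasibility outcome refines the classifier in either case.
    X_all = np.vstack([X_all, x_next])
    y_feas = np.append(y_feas, feasible(x_next))
    svm.fit(X_all, y_feas)

best = X[np.argmin(y)]
print(best, float(y.min()))
```

Note the design choice this illustrates: every proposed point yields a feasibility label even when the objective cannot be computed there, so the classifier keeps improving alongside the GP surrogate, which is what makes such schemes sample-efficient when the objective is undefined outside the feasible region.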


Author information


Corresponding author

Correspondence to A. Candelieri.


Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Archetti, F., Candelieri, A., Galuzzi, B.G., Perego, R. (2021). Learning Enabled Constrained Black-Box Optimization. In: Pardalos, P.M., Rasskazova, V., Vrahatis, M.N. (eds) Black Box Optimization, Machine Learning, and No-Free Lunch Theorems. Springer Optimization and Its Applications, vol 170. Springer, Cham. https://doi.org/10.1007/978-3-030-66515-9_1
