Abstract
In engineering design optimization, multiple conflicting objectives must often be optimized simultaneously. Bayesian optimization (BO) has been successfully applied to multi-objective optimization problems to reduce computational expense. However, the high cost of high-fidelity simulations has not been fully addressed. Combining BO methods with a bi-fidelity surrogate model can further reduce this cost by exploiting the information carried by samples of different fidelities. In this paper, a bi-fidelity BO method for multi-objective optimization is proposed, based on the lower confidence bound (LCB) function and the hierarchical Kriging model. A novel bi-fidelity acquisition function guides the optimization process, in which a cost coefficient balances the sampling cost against the information provided by a new sample. The proposed method quantifies the contribution of samples of different fidelities to improving the quality of the Pareto set, and fills a gap in the literature by extending LCB-based BO with a bi-fidelity surrogate model for multi-objective optimization. Compared with four state-of-the-art BO methods, the results show that the proposed method substantially reduces the expense while obtaining high-quality Pareto solutions.
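To make the idea concrete, the following is a minimal sketch of an LCB-based improvement measure combined with a cost-aware fidelity choice. It is an illustration only, not the paper's exact acquisition function: the improvement measure here is a simplified domination-margin score, and the names (`lcb_improvement`, `select_fidelity`) and the form of the cost trade-off are assumptions for the sketch.

```python
import numpy as np

def lcb(mu, sigma, kappa=2.0):
    """Lower confidence bound: an optimistic estimate of the objective values."""
    return np.asarray(mu) - kappa * np.asarray(sigma)

def lcb_improvement(mu, sigma, pareto_front, kappa=2.0):
    """Simplified improvement of the LCB vector beyond the current Pareto set.

    mu, sigma   : predicted means / standard deviations of the m objectives
                  at one candidate point.
    pareto_front: (n, m) array of current non-dominated objective vectors
                  (minimization).
    Returns the largest margin by which the LCB vector dominates any front
    point, or 0.0 if it dominates none.
    """
    f = lcb(mu, sigma, kappa)
    diffs = pareto_front - f          # positive where f improves on a front point
    gains = np.min(diffs, axis=1)     # domination margin w.r.t. each front point
    return max(float(np.max(gains)), 0.0)

def select_fidelity(impr_lf, impr_hf, cr):
    """Cost-aware fidelity choice: compare improvement per unit evaluation cost.

    cr is the cost of one high-fidelity sample relative to one low-fidelity
    sample (cr > 1), loosely playing the role of the cost coefficient.
    """
    return 'low' if impr_lf > impr_hf / cr else 'high'
```

In this sketch a cheap low-fidelity sample is preferred whenever its expected improvement, measured at equal cost, exceeds that of the high-fidelity sample; the paper's actual criterion is more refined.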
Abbreviations
- \(CR\): Cost ratio between samples of different fidelities
- \(F(x)\): Objectives of the multi-objective optimization problem
- \(f_{l} (x)\): Predicted mean of the low-fidelity surrogate model
- \(f_{i}^{j}\): The jth Pareto solution of the ith objective
- \(f_{LCB} (x)\): LCB function at sample x
- \(f_{LCB}^{l} (x)\): Low-fidelity LCB function at sample x
- \(f_{LCB}^{h} (x)\): High-fidelity LCB function at sample x
- \(g_{j} (x)\): The jth constraint of the multi-objective optimization problem
- \(H(p)\): Hypervolume indicator
- \(I(x)\): Improvement function at sample x
- \(I_{LCB} (x)\): Improvement of the LCB function at sample x beyond the current Pareto set
- \(I_{LCB}^{l} (x)\): Improvement of the low-fidelity LCB function at sample x beyond the current Pareto set
- \(I_{LCB}^{h} (x)\): Improvement of the high-fidelity LCB function at sample x beyond the current Pareto set
- \(I(x,t)\): The novel improvement function across fidelity levels
- \(N_{l}\): Number of low-fidelity samples
- \(N_{h}\): Number of high-fidelity samples
- \({\text{R}}\): Covariance matrix of the hierarchical Kriging model
- \({\text{R}}_{l}\): Covariance matrix of the low-fidelity Kriging model
- \(s_{l}^{2} (x)\): Mean-squared error of the low-fidelity Kriging model at unobserved points
- \(t\): Fidelity level
- \(X_{j}\): The jth Pareto solution
- \(Z( \cdot ),Z_{l} ( \cdot )\): Gaussian random processes
- \(\beta_{0,l} ,\,\beta_{0}\): Scaling factors of the trend model
- \({\text{r}}\): Correlation vector of the hierarchical Kriging model
- \({\text{r}}_{l}\): Correlation vector of the low-fidelity Kriging model
- \(\theta\): "Roughness" parameter
- \(\sigma^{2}\): Process variance
- \(LCB\): Subscript for quantities associated with the LCB function
- \(l\): Subscript for low-fidelity values
- \(h\): Subscript for high-fidelity values
- \(lb\): Lower bound of a value
- \(ub\): Upper bound of a value
- \(\mu\): Subscript for quantities associated with the predicted mean
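The hierarchical Kriging quantities above (\(\beta_{0}\), \({\text{R}}\), \({\text{r}}\), and their low-fidelity counterparts) fit together as follows: the low-fidelity predictor, scaled by \(\beta_{0}\), serves as the trend of the high-fidelity model. The sketch below illustrates this structure only; it fixes the roughness parameter \(\theta\) instead of tuning it by maximum likelihood, and the function name `fit_predict_hk` is an assumption for the example.

```python
import numpy as np

def rbf(A, B, theta):
    """Gaussian correlation with roughness parameter theta."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-theta * d2)

def fit_predict_hk(Xl, yl, Xh, yh, Xnew, theta=1.0, nugget=1e-8):
    """Minimal hierarchical-Kriging predicted mean (sketch).

    A low-fidelity ordinary Kriging model is fitted first; its prediction,
    scaled by beta0, becomes the trend of the high-fidelity model.
    """
    # Low-fidelity ordinary Kriging with constant trend beta0_l.
    Rl = rbf(Xl, Xl, theta) + nugget * np.eye(len(Xl))
    Rl_inv = np.linalg.inv(Rl)
    ones = np.ones(len(Xl))
    b0l = (ones @ Rl_inv @ yl) / (ones @ Rl_inv @ ones)

    def f_l(X):
        # Low-fidelity predicted mean at arbitrary points.
        return b0l + rbf(X, Xl, theta) @ Rl_inv @ (yl - b0l)

    # High-fidelity model: trend = beta0 * f_l(x).
    F = f_l(Xh)
    R = rbf(Xh, Xh, theta) + nugget * np.eye(len(Xh))
    R_inv = np.linalg.inv(R)
    b0 = (F @ R_inv @ yh) / (F @ R_inv @ F)
    r = rbf(Xnew, Xh, theta)
    return b0 * f_l(Xnew) + r @ R_inv @ (yh - b0 * F)
```

By construction the predictor interpolates the high-fidelity data (up to the small nugget), while the scaled low-fidelity trend carries the global shape between high-fidelity samples.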
Funding
This research has been supported by the National Natural Science Foundation of China (NSFC) under Grant No. 52105256 and No. 52188102.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Replication of results
The main steps of the proposed method are described in Sect. 3. The Matlab code is available at https://pan.baidu.com/s/1B0vf4QaWByIoNUvoa9vFUA (password: 0ubx).
Additional information
Responsible Editor: Ramin Bostanabad.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Xu, K., Shu, L., Zhong, L. et al. A bi-fidelity Bayesian optimization method for multi-objective optimization with a novel acquisition function. Struct Multidisc Optim 66, 53 (2023). https://doi.org/10.1007/s00158-023-03509-9