
A bi-fidelity Bayesian optimization method for multi-objective optimization with a novel acquisition function

  • Research Paper
  • Published:
Structural and Multidisciplinary Optimization

Abstract

In engineering design optimization, there are often multiple conflicting objectives. Bayesian optimization (BO) has been successfully applied to multi-objective optimization problems to reduce computational expense. However, the high cost associated with high-fidelity simulations has not been fully addressed. Combining BO with a bi-fidelity surrogate model can further reduce this cost by exploiting the information contained in samples of different fidelities. In this paper, a bi-fidelity BO method for multi-objective optimization based on the lower confidence bound (LCB) function and the hierarchical Kriging model is proposed. In the proposed method, a novel bi-fidelity acquisition function is developed to guide the optimization process, in which a cost coefficient is adopted to balance the sampling cost against the information provided by a new sample. The proposed method quantifies the contribution of samples with different fidelities to improving the quality of the Pareto set and fills a gap in the literature by extending LCB-based BO with a bi-fidelity surrogate model for multi-objective optimization. Compared with four state-of-the-art BO methods, the results show that the proposed method substantially reduces expense while obtaining high-quality Pareto solutions.
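
To illustrate the general idea of a cost-aware bi-fidelity acquisition (this is a minimal sketch, not the authors' released MATLAB implementation), the snippet below forms LCB values from low- and high-fidelity surrogate predictions, measures their improvement over the current Pareto set, and discounts the high-fidelity score by the cost ratio. The helper names (e.g., lcb_improvement, bi_fidelity_acquisition) and the distance-based improvement measure are assumptions made purely for illustration.

    import numpy as np

    def lcb(mean, std, a=2.0):
        """Conventional lower confidence bound (minimization): mean - a * std."""
        return mean - a * std

    def dominates(u, v):
        """True if objective vector u dominates v (minimization)."""
        return np.all(u <= v) and np.any(u < v)

    def lcb_improvement(lcb_vec, pareto_front):
        """Assumed improvement measure of an LCB objective vector beyond the
        current Pareto front: zero if the vector is dominated, otherwise the
        smallest Euclidean distance to the front."""
        if any(dominates(p, lcb_vec) for p in pareto_front):
            return 0.0
        return float(np.min(np.linalg.norm(pareto_front - lcb_vec, axis=1)))

    def bi_fidelity_acquisition(mean_l, std_l, mean_h, std_h, pareto_front, cost_ratio):
        """Compare the LCB improvement offered by a new low-fidelity sample with
        that of a new high-fidelity sample, discounting the latter by its extra
        cost (cost_ratio = cost_high / cost_low), and return the preferred
        fidelity and its score."""
        imp_l = lcb_improvement(lcb(mean_l, std_l), pareto_front)
        imp_h = lcb_improvement(lcb(mean_h, std_h), pareto_front)
        score_l, score_h = imp_l, imp_h / cost_ratio
        return ("low", score_l) if score_l >= score_h else ("high", score_h)

    # Usage example with two objectives and a small current Pareto front.
    pareto_front = np.array([[1.0, 4.0], [2.0, 2.5], [4.0, 1.0]])
    fidelity, score = bi_fidelity_acquisition(
        mean_l=np.array([1.8, 2.0]), std_l=np.array([0.5, 0.4]),
        mean_h=np.array([1.9, 2.1]), std_h=np.array([0.2, 0.2]),
        pareto_front=pareto_front, cost_ratio=4.0)
    print(fidelity, round(score, 3))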

Abbreviations

\(CR\) :

Cost ratio of samples with different fidelities

\(F(x)\) :

Objective of multi-objective optimization

\(f_{l} (x)\) :

Predicted mean of low-fidelity surrogate model

\(f_{i}^{j}\) :

Value of the ith objective at the jth Pareto solution

\(f_{LCB} (x)\) :

LCB function at sample x

\(f_{LCB}^{l} (x)\) :

Low-fidelity LCB function at sample x

\(f_{LCB}^{h} (x)\) :

High-fidelity LCB function at sample x

\(g_{j} (x)\) :

jth constraint of the multi-objective optimization problem

\(H(p)\) :

Hypervolume indicator

\(I(x)\) :

Improvement function at sample x

\(I_{LCB} (x)\) :

Improvement of LCB function at sample x beyond current Pareto set

\(I_{LCB}^{l} (x)\) :

Improvement of low-fidelity LCB at sample x beyond current Pareto set

\(I_{LCB}^{h} (x)\) :

Improvement of high-fidelity LCB at sample x beyond current Pareto set

\(I(x,t)\) :

Novel improvement function at sample x for fidelity level t

\(N_{l}\) :

Number of low-fidelity samples

\(N_{h}\) :

Number of high-fidelity samples

\({\text{R}}\) :

Covariance matrix of the hierarchical Kriging model

\({\text{R}}_{l}\) :

Covariance matrix of the low-fidelity Kriging model

\(s_{l}^{2} (x)\) :

Mean-squared error of the low-fidelity Kriging model at unobserved points

\(t\) :

Fidelity level

\(X_{j}\) :

jth Pareto solution

\(Z( \cdot ),Z_{l} ( \cdot )\) :

Gaussian random process

\(\beta_{0,l} {,}\,\beta_{0}\) :

Scaling factor of trend model

\({\text{r}}\) :

Correlation vector of the hierarchical Kriging model

\({\text{r}}_{l}\) :

Correlation vector of the low-fidelity Kriging model

\(\theta\) :

“Roughness” parameter

\(\sigma^{2}\) :

Process variance

\(LCB\) :

Function associated with LCB

\(l\) :

Low-fidelity value

\(h\) :

High-fidelity value

\(lb\) :

Lower bound of the value

\(ub\) :

Upper bound of the value

\(\mu\) :

Value associated with predicted mean
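
For orientation, the LCB quantities listed above follow the conventional lower confidence bound form; the display below is a sketch of that standard definition (the balance coefficient \(A\) is an assumed symbol, not part of the paper's nomenclature):

\[
f_{LCB}(x) = \mu(x) - A\,s(x), \qquad
f_{LCB}^{l}(x) = \mu_{l}(x) - A\,s_{l}(x), \qquad
f_{LCB}^{h}(x) = \mu_{h}(x) - A\,s_{h}(x),
\]

where \(\mu(\cdot)\) and \(s(\cdot)\) denote the predicted mean and standard deviation of the corresponding Kriging model (the hierarchical Kriging model in the high-fidelity case, the low-fidelity Kriging model otherwise), and a larger \(A\) favors exploration over exploitation.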


Funding

This research has been supported by the National Natural Science Foundation of China (NSFC) under Grant No. 52105256 and No. 52188102.

Author information

Corresponding author

Correspondence to Leshi Shu.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Replication of results

The main steps of the proposed method are introduced in Sect. 3. The MATLAB code is available at https://pan.baidu.com/s/1B0vf4QaWByIoNUvoa9vFUA (password: 0ubx).

Additional information

Responsible Editor: Ramin Bostanabad.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Xu, K., Shu, L., Zhong, L. et al. A bi-fidelity Bayesian optimization method for multi-objective optimization with a novel acquisition function. Struct Multidisc Optim 66, 53 (2023). https://doi.org/10.1007/s00158-023-03509-9

  • Received:

  • Revised:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1007/s00158-023-03509-9
