# MISO: mixed-integer surrogate optimization framework

DOI: 10.1007/s11081-015-9281-2

- Cite this article as:
- Müller, J. Optim Eng (2016) 17: 177. doi:10.1007/s11081-015-9281-2


## Abstract

We introduce MISO, the mixed-integer surrogate optimization framework. MISO aims at solving computationally expensive black-box optimization problems with mixed-integer variables. This type of optimization problem is encountered in many applications for which time-consuming simulation codes must be run in order to obtain an objective function value. Examples include optimal reliability design and structural optimization. A single objective function evaluation may take from several minutes to hours or even days. Thus, only very few objective function evaluations are allowable during the optimization. The development of algorithms for this type of optimization problem has, however, rarely been addressed in the literature. Because the objective function is black-box, derivatives are not available, and numerically approximating the derivatives requires a prohibitively large number of function evaluations. Therefore, we use computationally cheap surrogate models to approximate the expensive objective function and to decide at which points in the variable domain the expensive objective function should be evaluated. We develop a general surrogate model framework and show how sampling strategies of well-known surrogate model algorithms for continuous optimization can be modified for mixed-integer variables. We introduce two new algorithms that combine different sampling strategies and local search to obtain high-accuracy solutions. We compare MISO in numerical experiments to a genetic algorithm, NOMAD version 3.6.2, and SO-MI. The results show that MISO is in general more efficient than NOMAD and the genetic algorithm with respect to finding improved solutions within a limited budget of allowable evaluations. The performance of MISO depends on the chosen sampling strategy. The MISO algorithm that combines a coordinate perturbation search with a target value strategy and a local search performs best among all algorithms.

### Keywords

Mixed-integer optimization · Surrogate models · Black-box optimization · Radial basis functions · Derivative-free · Global optimization

### Abbreviations

- DYCORS
DYnamic COordinate search using response surface models (Regis and Shoemaker 2013)

- EGO
Efficient global optimization (Jones et al. 1998)

- MISO
Mixed-integer surrogate optimization

- MISO-CP
MISO-coordinate perturbation

- MISO-EI
MISO-expected improvement

- MISO-RS
MISO-random sampling

- MISO-SM
MISO-surface minimum

- MISO-TV
MISO-target value

- MISO-CPTV
MISO with combination of CP and TV

- MISO-CPTV-local
MISO with combination of CP, TV, and a local search

- MISO-CPTV-l(f)
MISO-CPTV-local that uses fmincon as local optimizer

- MISO-CPTV-l(o)
MISO-CPTV-local that uses ORBIT (Wild et al. 2007) as local optimizer

- NOMAD
Nonlinear optimization by mesh-adaptive direct search (Le Digabel 2011), version 3.6.2

- RBF
Radial basis function

- SO-MI
Surrogate optimization-mixed integer (Müller et al. 2013b)

- SRBF
Stochastic radial basis function algorithm (Regis and Shoemaker 2007)

- \(f(\cdot )\)
Computationally expensive objective function

- *d*
Problem dimension

- \(d_1\)
Number of integer variables

- \(d_2\)
Number of continuous variables

- \({\mathbf {z}}\)
Variable vector

- \(z_i^l\), \(z_i^u\)
Lower and upper variable bounds of the *i*th variable

- \({\mathcal {Z}}\)
Set of evaluated points, \({\mathcal {Z}}=\{{\mathbf {z}}_1,\ldots ,{\mathbf {z}}_n\}\)

- \({\mathcal {Z}}_l\)
Set of points evaluated during the local search (*l*-step)

- \(n_0\)
Number of points in the initial experimental design

- \(n_1\)
Number of function evaluations done in the local search (*l*-step)

- *n*
Number of already evaluated points

- \(s_n(\cdot )\)
Surrogate model fit to *n* data points

- \({\mathbb {I}}\)
Indices of the integer variables