Abstract
This paper is concerned with the definition of new derivative-free methods for box-constrained multiobjective optimization. The method that we propose is a non-trivial extension of the well-known implicit filtering algorithm to the multiobjective case. Global convergence results are stated under smoothness assumptions on the objective functions. We also show how the proposed method can be used as a tool to enhance the performance of the Direct MultiSearch (DMS) algorithm. Numerical results on a set of test problems show the efficiency of the implicit filtering algorithm when used to find a single Pareto solution of the problem. Furthermore, numerical experiments also show that the proposed algorithm improves the performance of DMS alone when used to reconstruct the entire Pareto front.
Acknowledgements
We are thankful to three anonymous reviewers whose stimulating comments and suggestions greatly helped us improve the paper. We would also like to thank Prof. Ana Luísa Custódio, José F. Aguilar Madeira, A. Ismael F. Vaz, and Luís Nunes Vicente for providing us with the MATLAB code of their Direct MultiSearch (DMS) algorithm. This work was partially supported by INdAM-GNCS.
Appendix: Technical results
In this appendix we prove two technical results that are used in the convergence analysis.
Proposition 2
Let \(f:\mathbb {R}^n \rightarrow \mathbb {R}\) be continuously differentiable and let \(x\in \mathcal {F}\). Let \(\{z_k\}\subset \mathcal {F}\) and \(\{h_k\}\subset \mathbb {R}^+\) be sequences such that
Assume that, for \(i=1,\ldots ,n\), at least one of the following conditions holds:
Then we have
Proof
Let \(i\in \{1,\ldots ,n\}\) and define the following subsets
By the definition of the approximated gradient, we have
Suppose that \(K_1\) is an infinite subset. For all \(k\in K_1\), by the Mean Value Theorem, we can write
where \(\xi _k=z_k+\theta _k h_ke_i\), with \(\theta _k\in (0,1)\). Taking the limit as \(k\rightarrow \infty \), \(k\in K_1\), and recalling (28) and the continuity of the gradient, we obtain
By repeating the same reasoning with the sets \(K_2\) and \(K_3\), we have
and the thesis is proved. \(\square \)
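Proposition 2 is a consistency statement for the finite-difference gradient: as \(z_k\rightarrow x\) and \(h_k\rightarrow 0\), the approximated gradient converges to \(\nabla f(x)\). The following sketch illustrates this numerically; the forward-difference operator `fd_gradient` and the quadratic test function are illustrative assumptions, not the paper's exact operator (which, in particular, also handles points near the boundary of the box).

```python
import numpy as np

def fd_gradient(f, z, h):
    """Forward-difference approximation of grad f at z with stepsize h.

    Illustrative sketch only: near the boundary of the feasible box the
    paper's operator must use different stencils, which we omit here.
    """
    n = z.size
    g = np.empty(n)
    fz = f(z)
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0
        g[i] = (f(z + h * e) - fz) / h
    return g

# Consistency check in the spirit of Proposition 2: as z_k -> x and
# h_k -> 0, the approximated gradient tends to the true gradient.
f = lambda x: float(x[0] ** 2 + 3.0 * x[1] ** 2)   # grad f(x) = (2*x1, 6*x2)
x = np.array([1.0, -0.5])                          # grad f(x) = (2, -3)
for h in (1e-1, 1e-3, 1e-6):
    zk = x + h * np.array([0.3, -0.7])             # z_k approaching x
    err = np.linalg.norm(fd_gradient(f, zk, h) - np.array([2.0, -3.0]))
    print(h, err)                                  # error shrinks with h
```

Running the loop shows the approximation error decreasing with \(h\), as the Mean Value Theorem argument in the proof predicts.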
Proposition 3
Consider Problem (1), let \(F:\mathbb {R}^n \rightarrow \mathbb {R}^m\) be continuously differentiable, \(x\in \mathcal {F}\), and let \(\theta :\mathcal {F}\times \mathbb {R}^+ \rightarrow \mathbb {R}\) be defined as in (9). Then:
(i) \(\theta (x,h)\le 0\) for all \(x \in \mathcal {F}\) and \(h>0\);

(ii) let \(\{z_k\}\subset \mathcal {F}\) and \(\{h_k\}\subset \mathbb {R}^+\) be sequences satisfying the assumptions of Proposition 2; then
$$\begin{aligned} \lim _{k\rightarrow \infty }\theta (z_k,h_k)=\theta (x). \end{aligned}$$
Proof
(i) Given \(x,y\in \mathcal {F}\) and \(h>0\), we consider the function g defined as follows:
and note that
Then \(\theta (x,h)\le 0\) follows easily from \(g(x,h,x) =0\).
(ii) We first observe that
Let us define
so that
Denote by \(J_{h_k}(z_k)\) the approximated Jacobian \(J_{h_k}(z_k)=[ \nabla _{h_k}f_1(z_k),\ldots , \nabla _{h_k} f_m(z_k) ]^\top \). We can write
A similar bound, with \(y_k\) in place of \(y(x)\), can be obtained for \(\theta (x)-\theta (z_k,h_k)\). Then, since \(z_k\) and \(y_k\) belong to the compact set \(\mathcal {F}\), Proposition 2 yields \(|\theta (z_k,h_k)-\theta (x)|\rightarrow 0\) as \(k\rightarrow \infty \). \(\square \)
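The argument in part (i) uses only that \(g(x,h,x)=0\) and that \(\theta (x,h)\) is a minimum of \(g\) over the feasible set. The sketch below checks this sign property numerically under an assumed form of (9), \(\theta (x,h)=\min _{y\in \mathcal {F}}\max _i \nabla _h f_i(x)^\top (y-x)+\tfrac{1}{2}\Vert y-x\Vert ^2\), which is our guess consistent with the proof but not taken from the paper; the sampled minimum is a crude surrogate for the exact one.

```python
import numpy as np

def theta_h(grads_h, x, lower, upper, samples=20000, seed=0):
    """Sampled surrogate for theta(x, h) = min_{y in F} g(x, h, y), where
    (ASSUMED form, consistent with g(x, h, x) = 0 used in part (i))
        g(x, h, y) = max_i grad_h f_i(x)^T (y - x) + 0.5 * ||y - x||^2.
    Minimizing g over random feasible samples plus y = x itself gives a
    value that is >= the true theta(x, h) and <= g(x, h, x) = 0.
    """
    rng = np.random.default_rng(seed)
    Y = rng.uniform(lower, upper, size=(samples, x.size))
    Y = np.vstack([Y, x])                     # include y = x, so result <= 0
    D = Y - x
    lin = D @ np.asarray(grads_h).T           # linear terms, one column per f_i
    g = lin.max(axis=1) + 0.5 * (D ** 2).sum(axis=1)
    return g.min()

# Illustrative data: box F = [0, 1]^2 and two approximated gradients.
x = np.array([0.5, 0.5])
grads = [np.array([1.0, -2.0]), np.array([-1.0, 0.5])]
val = theta_h(grads, x, lower=np.zeros(2), upper=np.ones(2))
print(val <= 0.0)   # property (i): theta(x, h) <= 0
```

Because \(y=x\) is always included among the candidates, the computed value can never exceed zero, mirroring how property (i) follows from \(g(x,h,x)=0\).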
Cite this article
Cocchi, G., Liuzzi, G., Papini, A. et al. An implicit filtering algorithm for derivative-free multiobjective optimization with box constraints. Comput Optim Appl 69, 267–296 (2018). https://doi.org/10.1007/s10589-017-9953-2
Keywords
- Multiobjective nonlinear programming
- Derivative-free optimization
- Implicit filtering
Mathematics Subject Classification
- 90C30
- 90C56
- 65K05