A Projected Subgradient Method for Nondifferentiable Quasiconvex Multiobjective Optimization Problems

  • Published in: Journal of Optimization Theory and Applications

Abstract

In this paper, we propose a projected subgradient method for solving constrained nondifferentiable quasiconvex multiobjective optimization problems. The algorithm is based on the Plastria subdifferential to overcome potential shortcomings known from algorithms based on the classical gradient. Under suitable, yet rather general assumptions, we establish the convergence of the full sequence generated by the algorithm to a Pareto efficient solution of the problem. Numerical results are presented to illustrate our findings.
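
For readers who want to experiment, the following minimal Python sketch shows the generic projected subgradient template \(x^{k+1} = P_C(x^k - t_k g^k)\) that underlies methods of this type. The subgradient oracle, the step sizes, the stopping rule and the demo selection heuristic below are illustrative placeholders and are not the paper's Plastria-subdifferential-based scheme or its convergence conditions.

```python
import numpy as np

def projected_subgradient(x0, subgrad_oracle, project, step_sizes, max_iter=1000, tol=1e-12):
    """Generic projected subgradient template: x_{k+1} = P_C(x_k - t_k * g_k).

    subgrad_oracle(x) -> a direction at x (stand-in for an element of the Plastria
                         subdifferential used in the paper)
    project(x)        -> Euclidean projection onto the feasible set C
    step_sizes        -> iterable of step sizes t_k (e.g. a diminishing sequence)
    """
    x = np.asarray(x0, dtype=float)
    for _, t in zip(range(max_iter), step_sizes):
        g = np.asarray(subgrad_oracle(x), dtype=float)
        if np.linalg.norm(g) < tol:        # (approximate) stationarity
            break
        x = project(x - t * g)             # subgradient step followed by projection
    return x


# Illustrative run on data resembling Example 5.1: f1(x) = x^2, f2(x) = exp(-x), C = R.
# Picking the currently worse objective is a made-up heuristic for demonstration only.
def demo_oracle(x):
    if x[0] ** 2 >= np.exp(-x[0]):
        return np.array([2.0 * x[0]])      # gradient of f1
    return np.array([-np.exp(-x[0])])      # gradient of f2


x_final = projected_subgradient(
    x0=[3.0],
    subgrad_oracle=demo_oracle,
    project=lambda y: y,                   # C = R: projection is the identity
    step_sizes=(1.0 / (k + 1) for k in range(10**6)),
)
print(x_final)  # the iterate typically settles in the weakly efficient set [0, infinity)
```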

Notes

  1. Motivating the algorithm and its parameters by means of this example was suggested to us by one of the anonymous referees.

References

  1. Bauschke, H.H., Borwein, J.M.: On projection algorithms for solving convex feasibility problems. SIAM Rev. 38(3), 367–426 (1996). https://doi.org/10.1137/s0036144593251710

  2. Bello Cruz, J.Y.: A subgradient method for vector optimization problems. SIAM J. Optim. 23(4), 2169–2182 (2013). https://doi.org/10.1137/120866415

  3. Bello Cruz, J.Y., Lucambio Pérez, L.R., Melo, J.G.: Convergence of the projected gradient method for quasiconvex multiobjective optimization. Nonlinear Anal. Theory Methods Appl. 74(16), 5268–5273 (2011). https://doi.org/10.1016/j.na.2011.04.067

  4. Bento, G.C., Cruz Neto, J.X., Oliveira, P.R., Soubeyran, A.: The self regulation problem as an inexact steepest descent method for multicriteria optimization. Eur. J. Oper. Res. 235(3), 494–502 (2014). https://doi.org/10.1016/j.ejor.2014.01.002

  5. Bento, G.C., Cruz Neto, J.X., Santos, P.S.M., Souza, S.S.: A weighting subgradient algorithm for multiobjective optimization. Optim. Lett. 12(2), 399–410 (2018). https://doi.org/10.1007/s11590-017-1133-x

  6. Brito, A.S., Cruz Neto, J.X., Santos, P.S.M., Souza, S.S.: A relaxed projection method for solving multiobjective optimization problems. Eur. J. Oper. Res. 256(1), 17–23 (2017). https://doi.org/10.1016/j.ejor.2016.05.026

  7. Burachik, R., Graña Drummond, L.M., Iusem, A.N., Svaiter, B.F.: Full convergence of the steepest descent method with inexact line searches. Optimization 32(2), 137–146 (1995). https://doi.org/10.1080/02331939508844042

  8. Ceng, L.C., Yao, J.C.: Approximate proximal methods in vector optimization. Eur. J. Oper. Res. 183(1), 1–19 (2007). https://doi.org/10.1016/j.ejor.2006.09.070

  9. Da Cruz Neto, J.X., Da Silva, G.J.P., Ferreira, O.P., Lopes, J.O.: A subgradient method for multiobjective optimization. Comput. Optim. Appl. 54(3), 461–472 (2013). https://doi.org/10.1007/s10589-012-9494-7

  10. Daniilidis, A., Hadjisavvas, N., Martínez-Legaz, J.E.: An appropriate subdifferential for quasiconvex functions. SIAM J. Optim. 12(2), 407–420 (2001). https://doi.org/10.1137/S1052623400371399

  11. Dinh, N., Goberna, M.A., Long, D.H., López-Cerdá, M.A.: New Farkas-type results for vector-valued functions: a non-abstract approach. J. Optim. Theory Appl. 182(1), 4–29 (2019). https://doi.org/10.1007/s10957-018-1352-z

  12. Eschenauer, H., Koski, J., Osyczka, A. (eds.): Multicriteria Design Optimization. Springer, Berlin Heidelberg (1990). https://doi.org/10.1007/978-3-642-48697-5

  13. Fliege, J.: OLAF—A general modeling system to evaluate and optimize the location of an air polluting facility. OR Spektrum 23(1), 117–136 (2001). https://doi.org/10.1007/pl00013342

  14. Flores-Bazán, F., Vera, C.: Weak efficiency in multiobjective quasiconvex optimization on the real-line without derivatives. Optimization 58(1), 77–99 (2009). https://doi.org/10.1080/02331930701761524

  15. Flores-Bazán, F., Vera, C.: Efficiency in quasiconvex multiobjective nondifferentiable optimization on the real line. Optimization pp. 1–23 (2021). https://doi.org/10.1080/02331934.2021.1892103

  16. Fu, Y., Diwekar, U.M.: An efficient sampling approach to multiobjective optimization. Ann. Oper. Res. 132(1–4), 109–134 (2004). https://doi.org/10.1023/b:anor.0000045279.46948.dd

  17. Fukuda, E.H., Graña Drummond, L.M.: On the convergence of the projected gradient method for vector optimization. Optimization 60(8–9), 1009–1021 (2011). https://doi.org/10.1080/02331934.2010.522710

  18. Graña Drummond, L.M., Iusem, A.N.: A projected gradient method for vector optimization problems. Comput. Optim. Appl. 28(1), 5–29 (2004). https://doi.org/10.1023/b:coap.0000018877.86161.8b

  19. Gromicho, J.A.d.S.: Quasiconvex Optimization and Location Theory. Springer, US (1998). https://doi.org/10.1007/978-1-4613-3326-5

  20. Gutiérrez, C., Jiménez, B., Novo, V.: On second-order Fritz John type optimality conditions in nonsmooth multiobjective programming. Math. Program. 123(1), 199–223 (2010). https://doi.org/10.1007/s10107-009-0318-1

  21. Huong, N.T.T., Luan, N.N., Yen, N.D., Zhao, X.: The Borwein proper efficiency in linear fractional vector optimization. J. Nonlinear Convex Anal. 20(12), 2579–2595 (2019)

  22. IBM Corp.: IBM ILOG CPLEX Optimization Studio CPLEX User’s Manual (2009 et seqq.). Version 12.1

  23. Kim, D.S., Phạm, T.S., Tuyen, N.V.: On the existence of Pareto solutions for polynomial vector optimization problems. Math. Program. 177(1–2), 321–341 (2019). https://doi.org/10.1007/s10107-018-1271-7

  24. Luc, D.T.: Theory of Vector Optimization. Springer, Berlin Heidelberg (1989). https://doi.org/10.1007/978-3-642-50280-4

  25. Maingé, P.E.: Strong convergence of projected subgradient methods for nonsmooth and nonstrictly convex minimization. Set-Valued Anal. 16(7–8), 899–912 (2008). https://doi.org/10.1007/s11228-008-0102-z

  26. Miettinen, K.M.: Nonlinear Multiobjective Optimization. Springer, US (1998). https://doi.org/10.1007/978-1-4615-5563-6

  27. Papa Quiroz, E.A., Apolinário, H.C.F., Villacorta, K.D., Oliveira, P.R.: A linear scalarization proximal point method for quasiconvex multiobjective minimization. J. Optim. Theory Appl. 183(3), 1028–1052 (2019). https://doi.org/10.1007/s10957-019-01582-z

  28. Papa Quiroz, E.A., Cruzado, S.: An inexact scalarization proximal point method for multiobjective quasiconvex minimization. Ann. Oper. Res. (2020). https://doi.org/10.1007/s10479-020-03622-8

  29. Pardalos, P.M., Steponavičė, I., Žilinskas, A.: Pareto set approximation by the method of adjustable weights and successive lexicographic goal programming. Optim. Lett. 6(4), 665–678 (2012). https://doi.org/10.1007/s11590-011-0291-5

  30. Plastria, F.: Lower subdifferentiable functions and their minimization by cutting planes. J. Optim. Theory Appl. 46(1), 37–53 (1985). https://doi.org/10.1007/bf00938758

  31. De, P., Ghosh, J.B., Wells, C.E.: On the minimization of completion time variance with a bicriteria extension. Oper. Res. 40(6), 1148–1155 (1992). https://doi.org/10.1287/opre.40.6.1148

  32. Rockafellar, R.T.: Convex Analysis. Princeton University Press, Princeton (1970; reprint 2015)

  33. Thomann, J., Eichfelder, G.: A trust-region algorithm for heterogeneous multiobjective optimization. SIAM J. Optim. 29(2), 1017–1047 (2019). https://doi.org/10.1137/18m1173277

  34. Wang, J., Hu, Y., Yu, C.K.W., Li, C., Yang, X.: Extended Newton methods for multiobjective optimization: majorizing function technique and convergence analysis. SIAM J. Optim. 29(3), 2388–2421 (2019). https://doi.org/10.1137/18m1191737

  35. White, D.J.: Epsilon-dominating solutions in mean-variance portfolio analysis. Eur. J. Oper. Res. 105(3), 457–466 (1998). https://doi.org/10.1016/s0377-2217(97)00056-8

  36. Xu, H., Rubinov, A.M., Glover, B.M.: Strict lower subdifferentiability and applications. J. Aust. Math. Soc. Ser. B Appl. Math. 40(3), 379–391 (1999). https://doi.org/10.1017/s0334270000010961

  37. Alber, Y.I., Iusem, A.N., Solodov, M.V.: On the projected subgradient method for nonsmooth convex optimization in a Hilbert space. Math. Program. 81(1), 23–35 (1998). https://doi.org/10.1007/bf01584842

  38. Zhao, X., Köbis, M.A.: On the convergence of general projection methods for solving convex feasibility problems with applications to the inverse problem of image recovery. Optimization 67(9), 1409–1427 (2018). https://doi.org/10.1080/02331934.2018.1474355

  39. Zhao, X., Ng, K.F., Li, C., Yao, J.C.: Linear regularity and linear convergence of projection-based methods for solving convex feasibility problems. Appl. Math. Optim. 78(3), 613–641 (2018). https://doi.org/10.1007/s00245-017-9417-1

  40. Zhao, X., Yao, Y.: Modified extragradient algorithms for solving monotone variational inequalities and fixed point problems. Optimization 69(9), 1987–2002 (2020). https://doi.org/10.1080/02331934.2019.1711087

Acknowledgements

The authors are grateful to the handling editor and the anonymous referees for their valuable remarks, comments and new references, which helped to improve the original presentation. The research of Xiaopeng Zhao was supported in part by the National Natural Science Foundation of China (Grant Number 11801411) and the Natural Science Foundation of Tianjin (Grant Number 18JCQNJC01100). The work of M. Köbis was carried out during the author's tenure of an ERCIM ‘Alain Bensoussan’ Fellowship Programme at the Norwegian University of Science and Technology. The research of Yonghong Yao was supported in part by the University Innovation Team of Tianjin (Grant Number TD13-5033). The research of Jen-Chih Yao was supported in part by Grant MOST 108-2115-M-039-005-MY3.

Author information

Corresponding author

Correspondence to Jen-Chih Yao.

Additional information

Communicated by Fabián Flores-Bazán.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix: Analytically Finding Pareto Optimal Solutions in the Small Test Cases

The (weak) Pareto optimal solutions in Examples 5.1–5.3 can be obtained by applying the techniques from [14, 15]. We partially adapt the notation of these two references to outline the analysis below (a brief numerical sanity check of the resulting sets is sketched after the list):

  • Example 5.1 Here, we have \(f_1 = x^2\), \(f_2 = \exp (-x)\) and no constraints (i.e., \(C = \mathbb {R}\)). In this example, we obtain \(\mathop {{\text {argmin}}}_{x \in C} f_1 = \lbrace 0 \rbrace = [0,0]\mathbin {=:}[\alpha ,\beta ]\) (bounded) and \(\mathop {{\text {argmin}}}_{x \in C} f_2 = \emptyset \) (empty), corresponding to ‘Case 2’ in [14, p. 95]. Then, the inclusion \({[}0,\infty {)} \subseteq E_{\mathrm {w}}\) follows from [14, Theorem 4.3(b)]. On the other hand, note that

    $$\begin{aligned} B_- {:=} \lbrace x \in C: x \ge \beta ,\, f_2(x) = f_2(\beta )\rbrace = [ 0, 0] \mathrel {=:}[\beta , \beta ^-] \end{aligned}$$

    and that \(\lbrace x \in C: x < 0,\, \exp (-x) = \exp (0) = 1\rbrace = \emptyset \). So, we obtain from the second case in [14, Theorem 4.3(b)] that \(E_{\mathrm {w}} \cap \lbrace x \in C: x < 0 \rbrace = \emptyset \). Consequently, \(E_{\mathrm {w}} = {[} 0, \infty {)}\).

    Furthermore, note that \(f_2\) is strictly decreasing. From [15, Corollary 5.5(b)], it can be obtained immediately that \(E = {[}0, \infty {)}\).

  • Example 5.2 For \(f_1 = \tfrac{x^5}{5}\), \(f_2 = \tfrac{x^3}{3}\), \(C = [-1,1]\), we get

    $$\begin{aligned} \mathop {{\text {argmin}}}_{x \in C} f_1 \cap \mathop {{\text {argmin}}}_{x \in C} f_2 = \lbrace -1 \rbrace \ne \emptyset \,. \end{aligned}$$

    From [14, Proposition 3.1] (resp. [15, Proposition 2.1]), we therefore conclude directly that \(E_{\mathrm {w}} = \lbrace -1 \rbrace \) (resp. \(E = \lbrace -1 \rbrace \)).

  • Example 5.3 For a single objective, the set of (classically) optimal solutions of this one-dimensional problem always coincides with E (provided it is nonempty). From that (or, formally, also from [14, Proposition 3.1] and [15, Proposition 2.1]), we get \(E = E_{\mathrm {w}} = \lbrace 0 \rbrace \).
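
As a rough numerical cross-check of the sets derived above, one can discretize each feasible set and flag the weakly efficient grid points with a pairwise dominance test. The short Python sketch below does this; the helper weakly_efficient_mask, the truncated grid ranges and the resolutions are illustrative choices and not part of the paper.

```python
import numpy as np

def weakly_efficient_mask(F):
    """Boolean mask of weakly Pareto efficient rows of the objective-value matrix F.

    A grid point is kept iff no other grid point is strictly better in every objective.
    """
    n = F.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        strictly_better = np.all(F < F[i], axis=1)   # points dominating row i strictly
        strictly_better[i] = False
        keep[i] = not strictly_better.any()
    return keep


# Example 5.1: f1(x) = x^2, f2(x) = exp(-x), C = R (truncated to [-2, 4] for the check).
xs = np.linspace(-2.0, 4.0, 601)
F = np.column_stack([xs**2, np.exp(-xs)])
ew = xs[weakly_efficient_mask(F)]
print(ew.min(), ew.max())              # roughly 0 and 4, consistent with E_w = [0, infinity)

# Example 5.2: f1(x) = x^5/5, f2(x) = x^3/3, C = [-1, 1].
xs = np.linspace(-1.0, 1.0, 401)
F = np.column_stack([xs**5 / 5.0, xs**3 / 3.0])
print(xs[weakly_efficient_mask(F)])    # only -1 survives, consistent with E_w = E = {-1}
```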

About this article

Cite this article

Zhao, X., Köbis, M.A., Yao, Y. et al. A Projected Subgradient Method for Nondifferentiable Quasiconvex Multiobjective Optimization Problems. J Optim Theory Appl 190, 82–107 (2021). https://doi.org/10.1007/s10957-021-01872-5
