
A Quasi-Newton Method with Wolfe Line Searches for Multiobjective Optimization

Published in: Journal of Optimization Theory and Applications

Abstract

We propose a BFGS method with Wolfe line searches for unconstrained multiobjective optimization problems. The algorithm is well defined even for general nonconvex problems. Global convergence and R-linear convergence to a Pareto optimal point are established for strongly convex problems. In the local convergence analysis, if the objective functions are locally strongly convex with Lipschitz continuous Hessians, the rate of convergence is Q-superlinear. In this respect, our method exactly mimics the classical BFGS method for single-criterion optimization.
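To make the abstract concrete, the classical single-criterion BFGS scheme with a Wolfe line search, which the paper says its multiobjective method exactly mimics, can be sketched as follows. This is an illustrative sketch only, not the authors' algorithm: the bisection-style line search, the test problem, and all function names here are assumptions, and the multiobjective case additionally requires a common descent direction across all objectives.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection-style search for a step satisfying the (weak) Wolfe conditions."""
    lo, hi = 0.0, np.inf
    fx, dg0 = f(x), grad(x) @ d          # f value and directional derivative at x
    for _ in range(max_iter):
        x_new = x + alpha * d
        if f(x_new) > fx + c1 * alpha * dg0:      # Armijo condition fails: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x_new) @ d < c2 * dg0:          # curvature condition fails: grow
            lo = alpha
            alpha = 2.0 * alpha if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha                          # both Wolfe conditions hold
    return alpha

def bfgs(f, grad, x0, tol=1e-8, max_iter=100):
    """BFGS with inverse-Hessian updates, safeguarded by the Wolfe line search."""
    n = len(x0)
    H = np.eye(n)                                 # inverse-Hessian approximation
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                                # quasi-Newton descent direction
        alpha = wolfe_line_search(f, grad, x, d)
        s = alpha * d
        x_new = x + s
        y = grad(x_new) - g
        sy = s @ y                                # Wolfe conditions imply sy > 0,
        if sy > 1e-12:                            # so the update stays well defined
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x = x_new
    return x

# Illustrative strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = bfgs(f, grad, np.zeros(2))               # minimizer solves A x = b
```

The positivity of `s @ y` enforced by the curvature condition is what keeps the update well defined without convexity assumptions, which is the same mechanism the paper exploits in the multiobjective setting.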


Data Availability

Codes supporting the numerical results are freely available in the GitHub repository, https://github.com/lfprudente/bfgsMOP.


Acknowledgements

This work was funded by FAPEG (Grants PPP03/15-201810267001725) and CNPq (Grants 424860/2018-0, 309628/2020-2, 405349/2021-1).

Author information

Correspondence to L. F. Prudente.

Ethics declarations

Conflict of interest

The authors declare no conflicts of interest.

Additional information

Communicated by Nobuo Yamashita.



Cite this article

Prudente, L.F., Souza, D.R. A Quasi-Newton Method with Wolfe Line Searches for Multiobjective Optimization. J Optim Theory Appl 194, 1107–1140 (2022). https://doi.org/10.1007/s10957-022-02072-5

