
Spectral conjugate gradient methods for vector optimization problems

Computational Optimization and Applications

Abstract

In this work, we present an extension of the spectral conjugate gradient (SCG) methods for solving unconstrained vector optimization problems, with respect to the partial order induced by a pointed, closed, and convex cone with nonempty interior. We first study the direct extension of the SCG methods and its global convergence without imposing explicit restrictions on the parameters, and show that the methods may lose desirable properties of the scalar setting, such as yielding descent directions. Using a truncation technique, we then propose a modified self-adjusting SCG algorithm that accommodates a wider range of parameters. Global convergence of the new scheme covers the vector extensions of three different spectral parameters and the corresponding Perry, Andrei, and Dai–Kou conjugate parameters (the SP, N, and JC schemes, respectively), without regular restarts or any convexity assumption. Under inexact line searches, we prove that the sequences generated by the proposed methods find points satisfying the first-order necessary condition for Pareto optimality. Finally, numerical experiments illustrating the practical behavior of the methods are presented.
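To fix ideas, the classical scalar SCG iteration in the spirit of Birgin and Martínez [9] combines a Barzilai–Borwein spectral parameter with a conjugate gradient direction. The sketch below is illustrative only, not the authors' vector algorithm: the Perry-type conjugate parameter, the descent safeguard, the Armijo backtracking line search, and all tolerances are assumptions made for this minimal scalar example.

```python
import numpy as np

def scg_minimize(f, grad, x0, max_iter=500, tol=1e-8):
    """Illustrative scalar spectral conjugate gradient (SCG) sketch.

    Direction: d_{k+1} = -theta_{k+1} * g_{k+1} + beta_{k+1} * d_k, where
    theta is the Barzilai-Borwein spectral parameter s's / s'y and beta is
    a Perry-type conjugate parameter. All safeguards here are assumptions.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial steepest descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (inexact, as in the scalar theory)
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        # Barzilai-Borwein spectral parameter, guarded against tiny curvature
        theta = s.dot(s) / sy if sy > 1e-12 else 1.0
        dy = d.dot(y)
        # Perry-type conjugate parameter (one illustrative scaled form)
        beta = g_new.dot(theta * y - s) / dy if abs(dy) > 1e-12 else 0.0
        d = -theta * g_new + beta * d
        # Safeguard: restart with a spectral gradient step if not descent
        if g_new.dot(d) >= 0:
            d = -theta * g_new
        x, g = x_new, g_new
    return x

# Example: minimize f(x) = 0.5*(x1^2 + 10*x2^2); the minimizer is the origin
f = lambda x: 0.5 * (x[0] ** 2 + 10.0 * x[1] ** 2)
grad = lambda x: np.array([x[0], 10.0 * x[1]])
x_star = scg_minimize(f, grad, np.array([3.0, -2.0]))
```

In the vector setting studied in the paper, the gradient is replaced by a steepest descent direction with respect to the ordering cone, and (as the abstract notes) such safeguards become essential because the direct extension may fail to produce descent directions.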

Data availability

No datasets were generated or analysed during this work, which is theoretical and mathematical in nature.

References

  1. Andrei, N.: A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Appl. Math. Lett. 20(6), 645–650 (2007)

  2. Andrei, N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22(4), 561–571 (2007)

  3. Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)

  4. Andrei, N.: New accelerated conjugate gradient algorithms as a modification of Dai–Yuan's computational scheme for unconstrained optimization. J. Comput. Appl. Math. 234(12), 3397–3410 (2010)

  5. Ansary, M.A., Panda, G.: A modified quasi-Newton method for vector optimization problem. Optimization 64(11), 2289–2306 (2015)

  6. Assunção, P.B., Ferreira, O.P., Prudente, L.F.: Conditional gradient method for multiobjective optimization. Comput. Optim. Appl. 78(3), 741–768 (2021)

  7. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8(1), 141–148 (1988)

  8. Bello Cruz, J.Y.: A subgradient method for vector optimization problems. SIAM J. Optim. 23(4), 2169–2182 (2013)

  9. Birgin, E.G., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43(2), 117–128 (2001)

  10. Birgin, E.G., Martínez, J.M.: Practical Augmented Lagrangian Methods for Constrained Optimization. SIAM, New York (2014)

  11. Bonnel, H., Iusem, A.N., Svaiter, B.F.: Proximal methods in vector optimization. SIAM J. Optim. 15(4), 953–970 (2005)

  12. Ceng, L.C., Yao, J.C.: Approximate proximal methods in vector optimization. Eur. J. Oper. Res. 183(1), 1–19 (2007)

  13. Chuong, T.D.: Newton-like methods for efficient solutions in vector optimization. Comput. Optim. Appl. 54(3), 495–516 (2013)

  14. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)

  15. Das, I., Dennis, J.E.: Normal-boundary intersection: a new method for generating the Pareto surface in nonlinear multicriteria optimization problems. SIAM J. Optim. 8(3), 631–657 (1998)

  16. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)

  17. El Moudden, M., El Ghali, A.: Multiple reduced gradient method for multiobjective optimization problems. Numer. Algorithms 79(4), 1257–1282 (2018)

  18. El Moudden, M., El Mouatasim, A.: Accelerated diagonal steepest descent method for unconstrained multiobjective optimization. J. Optim. Theory Appl. 188(1), 220–242 (2021)

  19. Faramarzi, P., Amini, K.: A modified spectral conjugate gradient method with global convergence. J. Optim. Theory Appl. 182(2), 667–690 (2019)

  20. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)

  21. Fliege, J., Svaiter, B.F.: Steepest descent methods for multicriteria optimization. Math. Methods Oper. Res. 51(3), 479–494 (2000)

  22. Fliege, J., Graña Drummond, L.M., Svaiter, B.F.: Newton's method for multiobjective optimization. SIAM J. Optim. 20(2), 602–626 (2009)

  23. Fukuda, E.H., Graña Drummond, L.M.: Inexact projected gradient method for vector optimization. Comput. Optim. Appl. 54(3), 473–493 (2013)

  24. Gonçalves, M.L.N., Prudente, L.F.: On the extension of the Hager–Zhang conjugate gradient method for vector optimization. Comput. Optim. Appl. 76(3), 889–916 (2020)

  25. Gonçalves, M.L.N., Lima, F.S., Prudente, L.F.: A study of Liu–Storey conjugate gradient methods for vector optimization. Appl. Math. Comput. 425, 127099 (2022)

  26. Gonçalves, M.L.N., Lima, F.S., Prudente, L.F.: Globally convergent Newton-type methods for multiobjective optimization. Comput. Optim. Appl. 83(2), 403–434 (2022)

  27. Graña Drummond, L.M., Iusem, A.N.: A projected gradient method for vector optimization problems. Comput. Optim. Appl. 28(1), 5–29 (2004)

  28. Graña Drummond, L.M., Svaiter, B.F.: A steepest descent method for vector optimization. J. Comput. Appl. Math. 175(2), 395–414 (2005)

  29. Graña Drummond, L.M., Raupp, F.M.P., Svaiter, B.F.: A quadratically convergent Newton method for vector optimization. Optimization 63(5), 661–677 (2014)

  30. Hillermeier, C.: Generalized homotopy approach to multiobjective optimization. J. Optim. Theory Appl. 110(3), 557–583 (2001)

  31. Huband, S., Hingston, P., Barone, L., While, L.: A review of multiobjective test problems and a scalable test problem toolkit. IEEE Trans. Evol. Comput. 10(5), 477–506 (2006)

  32. Jian, J., Chen, Q., Jiang, X., Zeng, Y., Yin, J.: A new spectral conjugate gradient method for large-scale unconstrained optimization. Optim. Methods Softw. 32(3), 503–515 (2017)

  33. Liu, J.K., Feng, Y.M., Zou, L.M.: A spectral conjugate gradient method for solving large-scale unconstrained optimization. Comput. Math. Appl. 77(3), 731–739 (2019)

  34. Lovison, A.: Singular continuation: generating piecewise linear approximations to Pareto sets via global analysis. SIAM J. Optim. 21(2), 463–490 (2011)

  35. Lu, F., Chen, C.R.: Newton-like methods for solving vector optimization problems. Appl. Anal. 93(8), 1567–1586 (2014)

  36. Luc, D.T.: Theory of Vector Optimization. Lecture Notes in Economics and Mathematical Systems, vol. 319. Springer, Berlin (1989)

  37. Lucambio Pérez, L.R., Prudente, L.F.: Nonlinear conjugate gradient methods for vector optimization. SIAM J. Optim. 28(3), 2690–2720 (2018)

  38. Lucambio Pérez, L.R., Prudente, L.F.: A Wolfe line search algorithm for vector optimization. ACM Trans. Math. Softw. 45(4), 1–23 (2019)

  39. Miglierina, E., Molho, E., Recchioni, M.C.: Box-constrained multi-objective optimization: a gradient-like method without a priori scalarization. Eur. J. Oper. Res. 188(3), 662–682 (2008)

  40. Mita, K., Fukuda, E.H., Yamashita, N.: Nonmonotone line searches for unconstrained multiobjective optimization problems. J. Glob. Optim. 75(1), 63–90 (2019)

  41. Moré, J.J., Garbow, B.S., Hillstrom, K.E.: Testing unconstrained optimization software. ACM Trans. Math. Softw. 7(1), 17–41 (1981)

  42. Perry, A.: A modified conjugate gradient algorithm. Oper. Res. 26(6), 1073–1078 (1978)

  43. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Inform. Rech. Opér. Sér. Rouge 3(1), 35–43 (1969)

  44. Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)

  45. Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Lecture Notes in Mathematics, vol. 1066. Springer, Berlin (1984)

  46. Schütze, O., Laumanns, M., Coello Coello, C.A., Dellnitz, M., Talbi, E.G.: Convergence of stochastic search algorithms to finite size Pareto set approximations. J. Glob. Optim. 41(4), 559–577 (2008)

  47. Sun, Z., Li, H., Wang, J., Tian, Y.: Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization. Int. J. Comput. Math. 95(10), 2082–2099 (2018)

  48. Tanabe, H., Fukuda, E.H., Yamashita, N.: An accelerated proximal gradient method for multiobjective optimization. Comput. Optim. Appl. (2023). https://doi.org/10.1007/s10589-023-00497-w

  49. Toint, P.L.: Test problems for partially separable optimization and results for the routine PSPMIN. Technical Report, The University of Namur, Department of Mathematics, Belgium (1983)

  50. Yu, G., Guan, L., Chen, W.: Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization. Optim. Methods Softw. 23(2), 275–293 (2008)

Acknowledgements

We would like to thank the two anonymous referees and the editor for their valuable comments and suggestions, which improved the quality of this paper. We are also deeply indebted to the authors of references [24, 37, 38] for their Fortran codes.

Author information

Corresponding author

Correspondence to Chun-Rong Chen.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information


This work was supported by the National Key Research and Development Program of China (Grant No. 2022YFB3304500) and the National Natural Science Foundation of China (Grant Nos. 11971078 and 12271072).

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

He, QR., Chen, CR. & Li, SJ. Spectral conjugate gradient methods for vector optimization problems. Comput Optim Appl 86, 457–489 (2023). https://doi.org/10.1007/s10589-023-00508-w

