Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions

Original Paper · Numerical Algorithms

Abstract

Many conjugate gradient methods with global convergence have been proposed for unconstrained optimization, such as the MPRP algorithm of Zhang et al. (IMA J. Numer. Anal. 26(4):629–640, 2006). Unfortunately, almost all of these methods require the gradient Lipschitz continuity condition. To the best of our knowledge, the question of how conjugate gradient methods should handle problems whose gradients are not Lipschitz continuous remains essentially unexplored. For such problems, this paper proposes Algorithm 1 and Algorithm 2, both based on the MPRP algorithm. The proposed algorithms have the following characteristics: (i) Algorithm 1 retains the sufficient descent property of the MPRP algorithm, independent of any line search technique; (ii) for nonconvex functions whose gradients are not Lipschitz continuous, the global convergence of Algorithm 1 is established by combining a trust-region property with the weak Wolfe-Powell line search technique; (iii) Algorithm 2 further improves Algorithm 1 so that its global convergence is obtained independently of the line search technique; (iv) numerical experiments show that the proposed algorithms are competitive with similar algorithms.
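
For orientation, the baseline the paper builds on is the three-term MPRP direction of Zhang et al. [43], whose defining feature is the line-search-independent descent identity g_k^T d_k = -||g_k||^2. The sketch below shows that baseline direction together with a generic weak Wolfe-Powell (WWP) line search; it is not the authors' Algorithm 1 or Algorithm 2, and the parameter values (delta, sigma, iteration cap) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mprp_direction(g, g_prev, d_prev):
    """Three-term MPRP direction of Zhang et al. [43]:
        d_k = -g_k + beta_k * d_{k-1} - theta_k * y_{k-1},
    where y_{k-1} = g_k - g_{k-1},
          beta_k  = g_k^T y_{k-1} / ||g_{k-1}||^2,
          theta_k = g_k^T d_{k-1} / ||g_{k-1}||^2.
    By construction g_k^T d_k = -||g_k||^2, so sufficient descent
    holds regardless of the line search used."""
    y = g - g_prev
    denom = np.dot(g_prev, g_prev)
    beta = np.dot(g, y) / denom
    theta = np.dot(g, d_prev) / denom
    return -g + beta * d_prev - theta * y

def weak_wolfe_powell(f, grad, x, d, delta=1e-4, sigma=0.9, alpha=1.0):
    """Bisection-style search for a step length satisfying the weak
    Wolfe-Powell conditions (delta, sigma are illustrative):
        f(x + a d) <= f(x) + delta * a * g^T d   (sufficient decrease)
        grad(x + a d)^T d >= sigma * g^T d       (curvature)"""
    lo, hi = 0.0, np.inf
    gtd = np.dot(grad(x), d)          # negative for a descent direction
    for _ in range(50):
        if f(x + alpha * d) > f(x) + delta * alpha * gtd:
            hi = alpha                # decrease condition fails: shrink step
            alpha = 0.5 * (lo + hi)
        elif np.dot(grad(x + alpha * d), d) < sigma * gtd:
            lo = alpha                # curvature condition fails: grow step
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha
```

A driver loop would start from d_0 = -g_0 and iterate x_{k+1} = x_k + alpha_k d_k; the identity g_k^T d_k = -||g_k||^2 can be verified numerically at every step, which is exactly the property the abstract refers to as being independent of the line search.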

References

  1. Al-Baali, M.: Descent property and global convergence of the Fletcher-Reeves method with inexact line search. IMA J. Numer. Anal. 5(1), 121–124 (1985)

  2. Dai, Y.: Analysis of conjugate gradient methods. Ph.D. Thesis, Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences (1997)

  3. Dai, Y.: Convergence properties of the BFGS algorithm. SIAM J. Optim. 13(3), 693–701 (2002)

  4. Dai, Y., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)

  5. Fletcher, R.: Practical Methods of Optimization, 2nd edn. Wiley, New York (1987)

  6. Fletcher, R., Reeves, C.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)

  7. Gilbert, J., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)

  8. Goldstein, A.A.: On steepest descent. J. Soc. Indus. Appl. Math. Ser. A: Control 3(1), 147–151 (1965)

  9. Hestenes, M., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)

  10. Hager, W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)

  11. Hager, W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)

  12. Liu, J., Lu, Z., Xu, J., Wu, S., Tu, Z.: An efficient projection-based algorithm without Lipschitz continuity for large-scale nonlinear pseudo-monotone equations. J. Comput. Appl. Math. 113822 (2021). https://doi.org/10.1016/j.cam.2021.113822

  13. Levenberg, K.: A method for the solution of certain non-linear problems in least squares. Quart. Appl. Math. 2(2), 164–168 (1944)

  14. Li, X., Wang, S., Jin, Z., et al.: A conjugate gradient algorithm under the Yuan-Wei-Lu line search technique for large-scale minimization optimization models. Math. Probl. Eng. 2018 (2018)

  15. Liu, Y., Storey, C.: Efficient generalized conjugate gradient algorithms, part 1: theory. J. Optim. Theory Appl. 69(1), 129–137 (1991)

  16. Polyak, B.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)

  17. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. ESAIM: Math. Model. Numer. Anal. 3(R1), 35–43 (1969)

  18. Powell, M.: Convergence properties of algorithms for nonlinear optimization. SIAM Rev. 28(4), 487–500 (1986)

  19. Powell, M.: Nonconvex Minimization Calculations and the Conjugate Gradient Method. Springer, Berlin (1984)

  20. Powell, M.J.D.: Convergence properties of a class of minimization algorithms. In: Nonlinear Programming 2, pp. 1–27. Academic Press (1975)

  21. Sheng, Z., Ouyang, A., Liu, L., et al.: A novel parameter estimation method for Muskingum model using new Newton-type trust region algorithm. Math. Probl. Eng. 2014 (2014)

  22. Sheng, Z., Yuan, G.: An effective adaptive trust region algorithm for nonsmooth minimization. Comput. Optim. Appl. 71(1), 251–271 (2018)

  23. Sheng, Z., Yuan, G., Cui, Z., et al.: An adaptive trust region algorithm for large-residual nonsmooth least squares problems. J. Ind. Manag. Optim. 14(2), 707 (2018)

  24. Sheng, Z., Yuan, G., Cui, Z.: A new adaptive trust region algorithm for optimization problems. Acta Math. Sci. 38(2), 479–496 (2018)

  25. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)

  26. Wolfe, P.: Convergence conditions for ascent methods. II: Some corrections. SIAM Rev. 13(2), 185–188 (1971)

  27. Wei, Z., Yao, S., Liu, L.: The convergence properties of some new conjugate gradient methods. Appl. Math. Comput. 183(2), 1341–1350 (2006)

  28. Yuan, G., Li, T., Hu, W.: A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems. Appl. Numer. Math. 147, 129–141 (2020)

  29. Yuan, G., Lu, X.: A modified PRP conjugate gradient method. Ann. Oper. Res. 166(1), 73–90 (2009)

  30. Yuan, G., Lu, J., Wang, Z.: The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems. Appl. Numer. Math. 152, 1–11 (2020)

  31. Yuan, G., Lu, J., Wang, Z.: The modified PRP conjugate gradient algorithm under a non-descent line search and its application in the Muskingum model and image restoration problems. Soft Comput. 25(8), 5867–5879 (2021)

  32. Yuan, G., Lu, X., Wei, Z.: A conjugate gradient method with descent direction for unconstrained optimization. J. Comput. Appl. Math. 233(2), 519–530 (2009)

  33. Yuan, G., Lu, S., Wei, Z.: A new trust-region method with line search for solving symmetric nonlinear equations. Int. J. Comput. Math. 88(10), 2109–2123 (2011)

  34. Yuan, G., Lu, X., Wei, Z.: BFGS trust-region method for symmetric nonlinear equations. J. Comput. Appl. Math. 230(1), 44–58 (2009)

  35. Yuan, G.: Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems. Optim. Lett. 3, 11–21 (2009)

  36. Yuan, G., Meng, Z., Li, Y.: A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations. J. Optim. Theory Appl. 168(1), 129–152 (2016)

  37. Yuan, G., Zhang, M., Zhou, Y.: Adaptive scaling damped BFGS method without gradient Lipschitz continuity. Appl. Math. Lett. 124, 107634 (2022). https://doi.org/10.1016/j.aml.2021.107634

  38. Yuan, G., Wei, Z., Li, G.: A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs. J. Comput. Appl. Math. 255, 86–96 (2014)

  39. Yuan, G., Wei, Z., Yang, Y.: The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions. J. Comput. Appl. Math. 362, 262–275 (2019)

  40. Yuan, G., Zhang, M.: A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations. J. Comput. Appl. Math. 286, 186–195 (2015)

  41. Yuan, Y.: Analysis on the conjugate gradient method. Optim. Methods Softw. 2(1), 19–29 (1993)

  42. Zoutendijk, G.: Nonlinear programming, computational methods. In: Integer and Nonlinear Programming, pp. 37–86. North-Holland, Amsterdam (1970)

  43. Zhang, L., Zhou, W., Li, D.: A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26(4), 629–640 (2006)

Funding

This work was supported by the National Natural Science Foundation of China (Grant No. 11661009), the High Level Innovation Teams and Excellent Scholars Program in Guangxi Institutions of Higher Education (Grant No. [2019]52), the Guangxi Natural Science Key Fund (No. 2017GXNSFDA198046), the Special Funds for Local Science and Technology Development Guided by the Central Government (No. ZY20198003), and the Special Foundation for Guangxi Ba Gui Scholars.

Author information

Corresponding author

Correspondence to Heshu Yang.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Yuan, G., Yang, H. & Zhang, M. Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions. Numer Algor 91, 145–160 (2022). https://doi.org/10.1007/s11075-022-01257-3
