
A limited memory descent Perry conjugate gradient method

  • Original Paper
  • Published in Optimization Letters

Abstract

In this work, we present a new limited memory conjugate gradient method based on a study of Perry’s method. An attractive property of the proposed method is that it corrects the loss of orthogonality that can occur in ill-conditioned optimization problems and decelerate convergence. A further advantage is that the memory is used only to monitor orthogonality, which is relatively cheap; when orthogonality is lost, the memory is used to generate a new orthogonal search direction. Under mild conditions, we establish the global convergence of the proposed method, provided that the line search satisfies the Wolfe conditions. Our numerical experiments indicate the efficiency and robustness of the proposed method.
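To make the ingredients of the abstract concrete, the sketch below implements the classical Perry conjugate gradient direction, d_{k+1} = -g_{k+1} + ((y_k - s_k)^T g_{k+1} / (y_k^T s_k)) s_k, combined with a line search enforcing the weak Wolfe conditions. This is not the paper's limited memory variant (which additionally monitors and restores orthogonality); it is a minimal illustration of the base Perry update on which that method builds. The function names `wolfe_line_search` and `perry_cg` are illustrative, not from the paper.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=60):
    """Bracketing/bisection search for a step satisfying the weak Wolfe conditions."""
    lo, hi, alpha = 0.0, np.inf, 1.0
    f0, slope0 = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * slope0:   # sufficient-decrease condition fails
            hi = alpha
        elif grad(x + alpha * d) @ d < c2 * slope0:       # curvature condition fails
            lo = alpha
        else:
            return alpha                                  # both Wolfe conditions hold
        alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
    return alpha

def perry_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimize f using Perry's conjugate gradient direction (Perry, 1978)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = y @ s                      # positive when the Wolfe conditions hold
        if abs(denom) > 1e-12:
            # Perry's direction: d = -g + ((y - s)^T g / (y^T s)) s
            d = -g_new + ((y - s) @ g_new / denom) * s
        else:
            d = -g_new                     # restart on breakdown
        if d @ g_new >= 0.0:
            d = -g_new                     # safeguard: fall back to steepest descent
        x, g = x_new, g_new
    return x

# Usage: a mildly ill-conditioned 2-D quadratic with minimizer A^{-1} b = (1, 0.1).
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = perry_cg(f, grad, np.zeros(2))
```

The restart safeguards mirror common practice in CG codes: when the curvature term y^T s is too small or the direction fails to be a descent direction, the iteration falls back to the negative gradient.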





Author information


Corresponding author

Correspondence to Ioannis E. Livieris.


Cite this article

Livieris, I.E., Pintelas, P. A limited memory descent Perry conjugate gradient method. Optim Lett 10, 1725–1742 (2016). https://doi.org/10.1007/s11590-015-0979-z

