Abstract
In this work, we present a new limited-memory conjugate gradient method based on a study of Perry’s method. An attractive property of the proposed method is that it corrects the loss of orthogonality that can occur in ill-conditioned optimization problems and that can decelerate convergence. A further advantage is that the memory is used only to monitor orthogonality, which is relatively cheap; when orthogonality is lost, the memory is used to generate a new orthogonal search direction. Under mild conditions, we establish the global convergence of the proposed method provided that the line search satisfies the Wolfe conditions. Our numerical experiments indicate the efficiency and robustness of the proposed method.
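The orthogonality-monitoring idea the abstract describes can be illustrated with a classical building block: in nonlinear conjugate gradient methods, loss of orthogonality between successive gradients is commonly detected with Powell’s restart test. The sketch below is not the paper’s algorithm (which uses a limited-memory correction to build a new orthogonal direction and a Wolfe line search); it is a minimal stand-in using the PRP+ update, Powell’s test, and a simple Armijo backtracking line search, with all names and tolerances chosen for illustration.

```python
import numpy as np

def cg_with_restart(f, grad, x0, tol=1e-6, max_iter=500, nu=0.2):
    """Nonlinear CG (PRP+ beta) with Powell's restart test, which
    detects loss of orthogonality between successive gradients.
    Illustrative only: the paper's method replaces the plain
    steepest-descent restart with a memory-based orthogonal direction,
    and assumes a Wolfe line search rather than Armijo backtracking."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking (simplified stand-in for a Wolfe search)
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Powell's test: restart when successive gradients
        # are far from orthogonal (|g_{k+1}^T g_k| >= nu ||g_{k+1}||^2)
        if abs(g_new.dot(g)) >= nu * g_new.dot(g_new):
            d = -g_new
        else:
            beta = g_new.dot(g_new - g) / g.dot(g)   # PRP formula
            d = -g_new + max(beta, 0.0) * d          # PRP+ truncation
            if g_new.dot(d) >= 0:                    # descent safeguard
                d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize a strictly convex quadratic f(x) = 0.5 x^T A x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda x: 0.5 * x.dot(A).dot(x)
grad = lambda x: A.dot(x)
x_star = cg_with_restart(f, grad, np.array([1.0, 1.0]))
```

On well-conditioned problems the restart branch rarely fires; the point of monitoring it cheaply, as the abstract notes, is that the corrective machinery is invoked only when orthogonality actually degrades.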
Livieris, I.E., Pintelas, P. A limited memory descent Perry conjugate gradient method. Optim Lett 10, 1725–1742 (2016). https://doi.org/10.1007/s11590-015-0979-z