A delayed weighted gradient method for strictly convex quadratic minimization
In this paper, an accelerated version of the steepest descent method is developed via a two-step iteration. The new algorithm uses delayed information to define its iterates. Specifically, in the first step, a prediction of the new trial point is computed by a gradient step with the exact minimal-gradient steplength; a correction is then computed as a weighted sum of the prediction and the predecessor of the current iterate. A convergence result is provided. Numerical experiments are performed to compare the efficiency and effectiveness of the proposal with similar methods in the literature. A numerical comparison of the new algorithm with the classical conjugate gradient method shows that the proposed method is a good alternative for solving large-scale problems.
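The two-step scheme described above can be sketched in code. The sketch below is an illustrative reading of the abstract, not the paper's exact algorithm: the prediction step uses the exact minimal-gradient steplength for a strictly convex quadratic f(x) = ½xᵀAx − bᵀx, and the correction weight `beta` is chosen here, as an assumption, to minimize the Euclidean norm of the resulting gradient along the segment between the predecessor iterate and the prediction. The function name `delayed_weighted_gradient` and all parameter names are hypothetical.

```python
import numpy as np

def delayed_weighted_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Sketch of a two-step (delayed) weighted gradient iteration for
    minimizing f(x) = 0.5 x^T A x - b^T x with A symmetric positive
    definite. The weight rule (beta minimizing the new gradient norm)
    is an illustrative assumption; the paper's rule may differ."""
    x_prev = x0.copy()          # predecessor iterate x_{k-1}
    x = x0.copy()               # current iterate x_k
    g = A @ x - b               # gradient at x_k
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        Ag = A @ g
        # Exact minimal-gradient steplength: minimizes ||grad f(x - t g)||.
        t = (g @ Ag) / (Ag @ Ag)
        y = x - t * g           # prediction (steepest descent step)
        g_y = g - t * Ag        # gradient at the prediction
        # Correction: x_{k+1} = x_{k-1} + beta * (y - x_{k-1}), with beta
        # minimizing ||g_prev + beta * (g_y - g_prev)|| (1-D least squares).
        g_prev = A @ x_prev - b
        d = g_y - g_prev
        denom = d @ d
        beta = 1.0 if denom == 0.0 else -(g_prev @ d) / denom
        x_prev, x = x, x_prev + beta * (y - x_prev)
        g = A @ x - b
    return x
```

Because `beta = 1` recovers the plain minimal-gradient step, the optimal `beta` can only further reduce the gradient norm, so the sketch inherits the convergence of the minimal-gradient method on strictly convex quadratics.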
Keywords: Gradient methods · Convex quadratic optimization · Linear systems of equations
Mathematics Subject Classification: 90C20 · 90C25 · 90C52 · 65F10
The author would like to thank Dr. Marcos Raydan for his helpful comments and suggestions on this work, and for sending pertinent information. The author also thanks Dr. Hugo Lara and two anonymous referees for their useful suggestions and comments.