An efficient gradient method using the Yuan steplength
We propose a new gradient method for quadratic programming, named SDC, which alternates some steepest descent (SD) iterates with some gradient iterates that use a constant steplength computed through the Yuan formula. The SDC method exploits the asymptotic spectral behaviour of the Yuan steplength to foster a selective elimination of the components of the gradient along the eigenvectors of the Hessian matrix, i.e., to push the search into subspaces of smaller and smaller dimension. The new method has global and \(R\)-linear convergence. Furthermore, numerical experiments show that it tends to outperform the Dai–Yuan method, which is one of the fastest gradient methods. In particular, SDC appears superior as the Hessian condition number and the accuracy requirement increase. Finally, if the number of consecutive SD iterates is not too small, the SDC method shows a monotonic behaviour.
Keywords: Gradient methods · Yuan steplength · Quadratic programming
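The alternation described in the abstract can be sketched as follows. This is a minimal NumPy illustration for minimizing \(f(x)=\tfrac{1}{2}x^{\top}Ax-b^{\top}x\) with \(A\) symmetric positive definite, not the authors' implementation: it runs \(h\) exact steepest descent (Cauchy) iterates, then reuses for \(s\) iterates a constant steplength built from the last two SD iterates via the Yuan formula \(\alpha^Y_k = 2\big/\big(\sqrt{(1/\alpha^{SD}_{k-1}-1/\alpha^{SD}_k)^2 + 4\|g_k\|^2/(\alpha^{SD}_{k-1}\|g_{k-1}\|)^2} + 1/\alpha^{SD}_{k-1} + 1/\alpha^{SD}_k\big)\). The function name `sdc` and the parameter choices `h`, `s` are illustrative assumptions.

```python
import numpy as np

def sdc(A, b, x0, h=4, s=2, tol=1e-8, max_iter=5000):
    """Sketch of an SDC-style scheme for min 0.5 x'Ax - b'x, A SPD:
    h Cauchy (exact steepest descent) steps, then s steps with a
    constant Yuan steplength computed from the last two SD iterates."""
    x = x0.astype(float).copy()
    g = A @ x - b
    k = 0
    while np.linalg.norm(g) > tol and k < max_iter:
        alpha_prev = g_prev = None
        # h exact steepest-descent (Cauchy) iterates
        for _ in range(h):
            Ag = A @ g
            alpha_sd = (g @ g) / (g @ Ag)            # Cauchy steplength
            alpha_prev, g_prev = alpha_sd, g.copy()  # data for Yuan formula
            x -= alpha_sd * g
            g -= alpha_sd * Ag
            k += 1
            if np.linalg.norm(g) <= tol or k >= max_iter:
                return x
        # Yuan steplength from the last two SD iterates (formula as hedged above)
        alpha_sd = (g @ g) / (g @ (A @ g))
        r = 1.0 / alpha_prev - 1.0 / alpha_sd
        t = 2.0 * np.linalg.norm(g) / (alpha_prev * np.linalg.norm(g_prev))
        alpha_y = 2.0 / (np.sqrt(r * r + t * t) + 1.0 / alpha_prev + 1.0 / alpha_sd)
        # s gradient iterates reusing the constant Yuan steplength
        for _ in range(s):
            x -= alpha_y * g
            g = A @ x - b
            k += 1
            if np.linalg.norm(g) <= tol or k >= max_iter:
                return x
    return x
```

On a well-conditioned quadratic this sketch reaches the stopping tolerance in a modest number of iterations; the selective elimination effect discussed in the abstract only becomes visible on more ill-conditioned Hessians.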
We wish to thank the anonymous referees for their constructive and detailed comments, which helped to improve the quality of this paper. This work was partially supported by INdAM-GNCS (2013 Project Numerical methods and software for large-scale optimization with applications to image processing and 2014 Project First-order optimization methods for image restoration and analysis), by the National Science Foundation (Grants 1016204 and 1115568), and by the Office of Naval Research (Grant N00014-11-1-0068).