A Limited Memory Gradient Projection Method for Box-Constrained Quadratic Optimization Problems

  • Conference paper
  • First Online:
Numerical Computations: Theory and Algorithms (NUMTA 2019)

Abstract

Gradient Projection (GP) methods are a very popular tool for addressing box-constrained quadratic problems, thanks to their simple implementation and low computational cost per iteration with respect, for example, to Newton approaches. It is nevertheless possible to include in GP schemes some second-order information about the problem by means of a clever choice of the steplength parameter, which controls the decrease along the anti-gradient direction. Borrowing the analysis developed by Barzilai and Borwein (BB) for unconstrained quadratic programming problems, in 2012 Roger Fletcher proposed a limited memory steepest descent (LMSD) method able to effectively sweep the spectrum of the Hessian matrix of the quadratic function to be optimized. In this work we analyze how to extend Fletcher's steplength selection rule to GP methods employed to solve box-constrained quadratic problems. In particular, we suggest a way to take the lower and upper bounds into account in the steplength definition, and we also provide a theoretical and numerical evaluation of our approach.
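
To make the setting of the abstract concrete, the short Python sketch below runs a gradient projection iteration with a safeguarded Barzilai-Borwein (BB1) steplength on a box-constrained quadratic program. It illustrates only the generic GP/BB framework the paper starts from: the function name gp_bb_box_qp, its parameters, the safeguards and the tolerances are our own illustrative choices, and the sketch does not implement the bound-aware, limited memory (LMSD-type) steplength rule proposed by the authors.

import numpy as np

def gp_bb_box_qp(A, b, lb, ub, x0, max_iter=1000, tol=1e-8):
    """Minimize 0.5*x^T A x - b^T x over the box lb <= x <= ub via
    gradient projection with a safeguarded BB1 steplength (illustrative only)."""
    x = np.clip(x0, lb, ub)
    g = A @ x - b                                 # gradient of the quadratic
    alpha = 1.0 / max(np.linalg.norm(g), 1e-12)   # crude initial steplength
    for k in range(max_iter):
        x_new = np.clip(x - alpha * g, lb, ub)    # projection onto the box
        s = x_new - x                             # iterate difference
        if np.linalg.norm(s) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new, k
        g_new = A @ x_new - b
        y = g_new - g                             # gradient difference
        sy = s @ y
        # BB1 rule alpha = (s^T s) / (s^T y), kept positive and bounded
        alpha = (s @ s) / sy if sy > 1e-16 else 1.0
        alpha = float(np.clip(alpha, 1e-10, 1e10))
        x, g = x_new, g_new
    return x, max_iter

# Usage on a random strictly convex instance
rng = np.random.default_rng(0)
n = 100
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)                           # symmetric positive definite Hessian
b = rng.standard_normal(n)
lb, ub = -np.ones(n), np.ones(n)
x_star, iters = gp_bb_box_qp(A, b, lb, ub, np.zeros(n))
print("GP-BB1 iterations:", iters)

A natural optimality measure for such a scheme is the norm of the projected gradient residual, i.e. of x - P(x - (Ax - b)) with P the projection onto the box, which vanishes exactly at stationary points. Fletcher's LMSD idea replaces the single BB steplength above by a sweep of several steplengths obtained as reciprocals of Ritz values of the Hessian; how to adapt such a sweep when the lower and upper bounds become active is precisely what the paper investigates.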


References

  1. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)


  2. Bertsekas, D.P.: Nonlinear Programming, 2nd edn. Athena Scientific, Boston (1999)


  3. Birgin, E.G., Martínez, J.M., Raydan, M.: Nonmonotone spectral projected gradient methods on convex sets. SIAM J. Optim. 10(4), 1196–1211 (2000)


  4. Crisci, S., Ruggiero, V., Zanni, L.: Steplength selection in gradient projection methods for box-constrained quadratic programs. Appl. Math. Comput. 356, 312–327 (2019)


  5. Fletcher, R.: A limited memory steepest descent method. Math. Program. Ser. A 135, 413–436 (2012)


  6. Frassoldati, G., Zanghirati, G., Zanni, L.: New adaptive stepsize selections in gradient methods. J. Ind. Manage. Optim. 4(2), 299–312 (2008)


  7. Golub, G.H., van Loan, C.F.: Matrix Computations, 3rd edn. Johns Hopkins University Press, Baltimore (1996)


  8. Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM J. Numer. Anal. 23(4), 707–716 (1986)


  9. Horn, R.A., Johnson, C.R.: Matrix Analysis. Cambridge University Press, Cambridge (2012)


  10. Huang, Y., Liu, H.: On the rate of convergence of projected Barzilai-Borwein methods. Optim. Meth. Softw. 30(4), 880–892 (2015)


  11. Porta, F., Prato, M., Zanni, L.: A new steplength selection for scaled gradient methods with application to image deblurring. J. Sci. Comput. 65, 895–919 (2015)


  12. di Serafino, D., Ruggiero, V., Toraldo, G., Zanni, L.: On the steplength selection in gradient methods for unconstrained optimization. Appl. Math. Comput. 318, 176–195 (2018)


  13. Zhou, B., Gao, L., Dai, Y.H.: Gradient methods with adaptive step-sizes. Comput. Optim. Appl. 35(1), 69–86 (2006)



Acknowledgments

We thank the anonymous reviewers for their careful reading of our manuscript and their many insightful comments and suggestions. This work has been partially supported by the INdAM research group GNCS.

Author information

Corresponding author

Correspondence to Serena Crisci.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Crisci, S., Porta, F., Ruggiero, V., Zanni, L. (2020). A Limited Memory Gradient Projection Method for Box-Constrained Quadratic Optimization Problems. In: Sergeyev, Y., Kvasov, D. (eds) Numerical Computations: Theory and Algorithms. NUMTA 2019. Lecture Notes in Computer Science, vol 11973. Springer, Cham. https://doi.org/10.1007/978-3-030-39081-5_15


  • DOI: https://doi.org/10.1007/978-3-030-39081-5_15

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-39080-8

  • Online ISBN: 978-3-030-39081-5

  • eBook Packages: Computer Science (R0)
