
Numerical experiments with partially separable optimization problems

  • Conference paper
Numerical Analysis

Part of the book series: Lecture Notes in Mathematics ((LNM,volume 1066))

Abstract

In this paper, we present numerical experiments with an algorithm that exploits the partial separability of an optimization problem. This research is motivated by the very large number of minimization problems in many variables that have this property. The results discussed in the paper cover both the unconstrained and bound-constrained cases, as well as numerical estimation of gradient vectors. It is shown that exploiting the underlying structure can lead to efficient algorithms, especially when the problem dimension is large.
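To illustrate the idea of partial separability referred to in the abstract, the sketch below (not the authors' code) evaluates an objective of the form f(x) = Σᵢ fᵢ(x_{Sᵢ}), where each element function fᵢ depends on only a small subset Sᵢ of the variables. The element functions here form a hypothetical chained Rosenbrock-like example chosen purely for illustration; the point is that the full gradient can be assembled element by element, so work and storage grow with the number of element variables rather than with n².

```python
import numpy as np

def element_value_grad(xi, xj):
    """One element function f_i(x_i, x_{i+1}) and its two partial derivatives.

    f_i = 100*(x_{i+1} - x_i**2)**2 + (1 - x_i)**2  (illustrative choice)
    """
    r = xj - xi**2
    val = 100.0 * r**2 + (1.0 - xi)**2
    gi = -400.0 * xi * r - 2.0 * (1.0 - xi)   # d f_i / d x_i
    gj = 200.0 * r                             # d f_i / d x_{i+1}
    return val, gi, gj

def value_and_grad(x):
    """Assemble f(x) and its gradient from the element contributions."""
    n = len(x)
    f = 0.0
    g = np.zeros(n)
    for i in range(n - 1):                     # each element touches only 2 variables
        val, gi, gj = element_value_grad(x[i], x[i + 1])
        f += val
        g[i] += gi                             # scatter element gradient into g
        g[i + 1] += gj
    return f, g

x0 = np.full(10, -1.2)                         # standard difficult start point
f0, g0 = value_and_grad(x0)
```

A partitioned quasi-Newton method in this setting would likewise keep a small Hessian approximation per element and update each from only the element's own variables, which is the structural saving the paper's experiments measure.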



Editor information

David F. Griffiths


Copyright information

© 1984 Springer-Verlag

About this paper

Cite this paper

Griewank, A., Toint, P.L. (1984). Numerical experiments with partially separable optimization problems. In: Griffiths, D.F. (eds) Numerical Analysis. Lecture Notes in Mathematics, vol 1066. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0099526


  • DOI: https://doi.org/10.1007/BFb0099526

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-13344-5

  • Online ISBN: 978-3-540-38881-4

  • eBook Packages: Springer Book Archive
