
An incremental primal–dual method for nonlinear programming with special structure

  • Original Paper
  • Published in Optimization Letters

Abstract

We propose a new class of incremental primal–dual techniques for solving nonlinear programming problems with special structure: the objective function is a sum of independent nonconvex, continuously differentiable terms, each minimized subject to its own set of nonlinear constraints. The technique performs successive primal–dual increments for each term of this decomposition. Each increment is computed by taking one Newton step toward the solution of the Karush–Kuhn–Tucker optimality conditions of the subproblem associated with that term. We show that, under mild assumptions on the original problem, the resulting incremental algorithm is q-linearly convergent.
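The incremental idea in the abstract can be illustrated with a minimal sketch (a toy under stated assumptions, not the authors' exact algorithm): minimize a sum of quadratic terms subject to a single nonlinear equality constraint, taking one damped Newton step on each term's KKT system per increment. The data points `a_i`, the unit-circle constraint, and the fixed damping factor `alpha` are all illustrative assumptions.

```python
import numpy as np

# Toy instance: minimize sum_i 0.5*||x - a_i||^2 subject to c(x) = x.x - 1 = 0,
# by one damped Newton step on each term's KKT conditions per increment.

def kkt_newton_increment(x, lam, a, alpha=0.1):
    """One damped Newton step on the KKT conditions of the subproblem
    min 0.5*||x - a||^2  s.t.  c(x) = x.x - 1 = 0."""
    n = len(x)
    g = x - a                          # gradient of the objective term
    c = x @ x - 1.0                    # constraint residual
    gc = 2.0 * x                       # constraint gradient
    H = (1.0 + 2.0 * lam) * np.eye(n)  # Hessian of the term's Lagrangian
    F = np.concatenate([g + lam * gc, [c]])           # KKT residual
    J = np.block([[H, gc[:, None]],                   # KKT Jacobian
                  [gc[None, :], np.zeros((1, 1))]])
    d = np.linalg.solve(J, -F)
    return x + alpha * d[:-1], lam + alpha * d[-1]

a_list = [np.array([2.0, 0.0]), np.array([0.0, 2.0])]  # objective terms
x, lam = np.array([1.0, 1.0]) / np.sqrt(2.0), 0.0      # feasible start
for epoch in range(200):
    for a in a_list:                   # one primal-dual increment per term
        x, lam = kkt_newton_increment(x, lam, a)
print(x)  # hovers near the unit circle, between the per-term minimizers
```

With a fixed damping factor the iterates settle into a small neighbourhood of the full problem's solution rather than converging exactly; a diminishing stepsize, as is standard in incremental methods, would remove the residual oscillation.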



Corresponding author

Correspondence to Nicolas P. Couellan.


About this article

Cite this article

Couellan, N.P., Trafalis, T.B. An incremental primal–dual method for nonlinear programming with special structure. Optim Lett 7, 51–62 (2013). https://doi.org/10.1007/s11590-011-0393-0

