Recent Advances in Bound Constrained Optimization

  • W. W. Hager
  • H. Zhang
Part of the IFIP International Federation for Information Processing book series (IFIPAICT, volume 199)


A new active set algorithm (ASA) for large-scale box constrained optimization is introduced. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for switching between the two steps. Numerical experiments and comparisons are presented using box constrained problems in the CUTEr and MINPACK test problem libraries.
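The nonmonotone gradient projection step described above can be illustrated with a short sketch. This is a minimal illustration, not the authors' ASA implementation: it assumes a Barzilai-Borwein trial step and a max-of-recent-values nonmonotone Armijo rule (in the spirit of Grippo, Lampariello, and Lucidi); all function names and parameter values here are our own choices.

```python
import numpy as np

def project_box(x, lo, hi):
    """Project x onto the box [lo, hi] componentwise."""
    return np.minimum(np.maximum(x, lo), hi)

def nonmonotone_gradient_projection(f, grad, x0, lo, hi,
                                    max_iter=500, memory=10, tol=1e-8):
    """Gradient projection with a Barzilai-Borwein trial step and a
    nonmonotone Armijo line search (reference value = max of recent f's)."""
    x = project_box(x0, lo, hi)
    g = grad(x)
    alpha = 1.0                  # trial step size (updated by BB formula)
    f_hist = [f(x)]              # recent function values for the nonmonotone test
    for _ in range(max_iter):
        # projected gradient; a small norm signals an approximate KKT point
        pg = x - project_box(x - g, lo, hi)
        if np.linalg.norm(pg, np.inf) <= tol:
            break
        f_ref = max(f_hist[-memory:])          # nonmonotone reference value
        t = 1.0
        while True:                            # Armijo backtracking
            x_new = project_box(x - t * alpha * g, lo, hi)
            d = x_new - x
            if f(x_new) <= f_ref + 1e-4 * g.dot(d) or t < 1e-12:
                break
            t *= 0.5
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        alpha = s.dot(s) / sy if sy > 0 else 1.0   # BB1 step size
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x
```

For a convex quadratic such as f(x) = ½‖x − c‖², the iteration stops at the projection of c onto the box, which is the exact minimizer.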


Keywords: nonmonotone gradient projection · box constrained optimization · active set algorithm (ASA) · cyclic BB method (CBB) · conjugate gradient method (CG_DESCENT) · degenerate optimization


References

  1. L. Armijo. Minimization of functions having Lipschitz continuous first partial derivatives. Pacific J. Math., 16:1–3, 1966.
  2. B. M. Averick, R. G. Carter, J. J. Moré, and G. L. Xue. The MINPACK-2 test problem collection. Technical report, Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, IL, 1992.
  3. J. Barzilai and J. M. Borwein. Two point step size gradient methods. IMA J. Numer. Anal., 8:141–148, 1988.
  4. D. P. Bertsekas. On the Goldstein-Levitin-Polyak gradient projection method. IEEE Trans. Automatic Control, 21:174–184, 1976.
  5. D. P. Bertsekas. Projected Newton methods for optimization problems with simple constraints. SIAM J. Control Optim., 20:221–246, 1982.
  6. E. G. Birgin and J. M. Martínez. Large-scale active-set box-constrained optimization method with spectral projected gradients. Comput. Optim. Appl., 23:101–125, 2002.
  7. E. G. Birgin, J. M. Martínez, and M. Raydan. Nonmonotone spectral projected gradient methods on convex sets. SIAM J. Optim., 10:1196–1211, 2000.
  8. E. G. Birgin, J. M. Martínez, and M. Raydan. Algorithm 813: SPG-software for convex-constrained optimization. ACM Trans. Math. Software, 27:340–349, 2001.
  9. I. Bongartz, A. R. Conn, N. I. M. Gould, and Ph. L. Toint. CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Software, 21:123–160, 1995.
  10. M. A. Branch, T. F. Coleman, and Y. Li. A subspace, interior, and conjugate gradient method for large-scale bound-constrained minimization problems. SIAM J. Sci. Comput., 21:1–23, 1999.
  11. J. V. Burke and J. J. Moré. On the identification of active constraints. SIAM J. Numer. Anal., 25:1197–1211, 1988.
  12. J. V. Burke and J. J. Moré. Exposing constraints. SIAM J. Optim., 4:573–595, 1994.
  13. J. V. Burke, J. J. Moré, and G. Toraldo. Convergence properties of trust region methods for linear and convex constraints. Math. Prog., 47:305–336, 1990.
  14. P. Calamai and J. Moré. Projected gradient methods for linearly constrained problems. Math. Prog., 39:93–116, 1987.
  15. T. F. Coleman and Y. Li. On the convergence of interior-reflective Newton methods for nonlinear minimization subject to bounds. Math. Prog., 67:189–224, 1994.
  16. T. F. Coleman and Y. Li. An interior trust region approach for nonlinear minimization subject to bounds. SIAM J. Optim., 6:418–445, 1996.
  17. T. F. Coleman and Y. Li. A trust region and affine scaling interior point method for nonconvex minimization with linear inequality constraints. Technical report, Cornell University, Ithaca, NY, 1997.
  18. A. R. Conn, N. I. M. Gould, and Ph. L. Toint. Global convergence of a class of trust region algorithms for optimization with simple bounds. SIAM J. Numer. Anal., 25:433–460, 1988.
  19. A. R. Conn, N. I. M. Gould, and Ph. L. Toint. A globally convergent augmented Lagrangian algorithm for optimization with general constraints and simple bounds. SIAM J. Numer. Anal., 28:545–572, 1991.
  20. Y. H. Dai, W. W. Hager, K. Schittkowski, and H. Zhang. The cyclic Barzilai-Borwein method for unconstrained optimization. IMA J. Numer. Anal., submitted, 2005.
  21. R. S. Dembo and U. Tulowitzki. On the minimization of quadratic functions subject to box constraints. Technical report, School of Organization and Management, Yale University, New Haven, CT, 1983.
  22. J. E. Dennis, M. Heinkenschloss, and L. N. Vicente. Trust-region interior-point algorithms for a class of nonlinear programming problems. SIAM J. Control Optim., 36:1750–1794, 1998.
  23. E. D. Dolan and J. J. Moré. Benchmarking optimization software with performance profiles. Math. Prog., 91:201–213, 2002.
  24. Z. Dostál. Box constrained quadratic programming with proportioning and projections. SIAM J. Optim., 7:871–887, 1997.
  25. Z. Dostál. A proportioning based algorithm with rate of convergence for bound constrained quadratic programming. Numer. Algorithms, 34:293–302, 2003.
  26. Z. Dostál, A. Friedlander, and S. A. Santos. Solution of coercive and semicoercive contact problems by FETI domain decomposition. Contemp. Math., 218:82–93, 1998.
  27. Z. Dostál, A. Friedlander, and S. A. Santos. Augmented Lagrangians with adaptive precision control for quadratic programming with simple bounds and equality constraints. SIAM J. Optim., 13:1120–1140, 2003.
  28. A. S. El-Bakry, R. A. Tapia, T. Tsuchiya, and Y. Zhang. On the formulation and theory of the primal-dual Newton interior-point method for nonlinear programming. J. Optim. Theory Appl., 89:507–541, 1996.
  29. F. Facchinei, J. Júdice, and J. Soares. An active set Newton's algorithm for large-scale nonlinear programs with box constraints. SIAM J. Optim., 8:158–186, 1998.
  30. F. Facchinei and S. Lucidi. A class of penalty functions for optimization problems with bound constraints. Optimization, 26:239–259, 1992.
  31. F. Facchinei, S. Lucidi, and L. Palagi. A truncated Newton algorithm for large-scale box constrained optimization. SIAM J. Optim., 12:1100–1125, 2002.
  32. A. Friedlander, J. M. Martínez, and S. A. Santos. A new trust region algorithm for bound constrained minimization. Appl. Math. Optim., 30:235–266, 1994.
  33. A. A. Goldstein. Convex programming in Hilbert space. Bull. Amer. Math. Soc., 70:709–710, 1964.
  34. L. Grippo, F. Lampariello, and S. Lucidi. A nonmonotone line search technique for Newton's method. SIAM J. Numer. Anal., 23:707–716, 1986.
  35. W. W. Hager and H. Zhang. CG_DESCENT user's guide. Technical report, Dept. Math., Univ. Florida, 2004.
  36. W. W. Hager and H. Zhang. A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim., 16:170–192, 2005.
  37. W. W. Hager and H. Zhang. A new active set algorithm for box constrained optimization. SIAM J. Optim., submitted, 2005.
  38. W. W. Hager and H. Zhang. CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Software, to appear, 2006.
  39. W. W. Hager and H. Zhang. A survey of nonlinear conjugate gradient methods. Pacific J. Optim., to appear, 2006.
  40. M. Heinkenschloss, M. Ulbrich, and S. Ulbrich. Superlinear and quadratic convergence of affine-scaling interior-point Newton methods for problems with simple bounds without strict complementarity assumption. Math. Prog., 86:615–635, 1999.
  41. C. Kanzow and A. Klug. On affine-scaling interior-point Newton methods for nonlinear minimization with bound constraints. Comput. Optim. Appl., to appear, 2006.
  42. D. Kinderlehrer and G. Stampacchia. An Introduction to Variational Inequalities and Their Applications, volume 31 of Classics in Applied Mathematics. SIAM, Philadelphia, PA, 2000.
  43. M. Lescrenier. Convergence of trust region algorithms for optimization with bounds when strict complementarity does not hold. SIAM J. Numer. Anal., 28:476–495, 1991.
  44. E. S. Levitin and B. T. Polyak. Constrained minimization methods. USSR Comput. Math. Math. Phys., 6:1–50, 1966.
  45. C. J. Lin and J. J. Moré. Incomplete Cholesky factorizations with limited memory. SIAM J. Sci. Comput., 21:24–45, 1999.
  46. C. J. Lin and J. J. Moré. Newton's method for large bound-constrained optimization problems. SIAM J. Optim., 9:1100–1127, 1999.
  47. J. M. Martínez. BOX-QUACAN and the implementation of augmented Lagrangian algorithms for minimization with inequality constraints. J. Comput. Appl. Math., 19:31–56, 2000.
  48. G. P. McCormick and R. A. Tapia. The gradient projection method under mild differentiability conditions. SIAM J. Control, 10:93–98, 1972.
  49. J. J. Moré and G. Toraldo. On the solution of large quadratic programming problems with bound constraints. SIAM J. Optim., 1:93–113, 1991.
  50. B. T. Polyak. The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys., 9:94–112, 1969.
  51. S. M. Robinson. Strongly regular generalized equations. Math. Oper. Res., 5:43–62, 1980.
  52. M. Ulbrich, S. Ulbrich, and M. Heinkenschloss. Global convergence of affine-scaling interior-point Newton methods for infinite-dimensional nonlinear problems with pointwise bounds. SIAM J. Control Optim., 37:731–764, 1999.
  53. S. J. Wright. Implementing proximal point methods for linear programming. J. Optim. Theory Appl., 65:531–554, 1990.
  54. H. Yamashita and H. Yabe. Superlinear and quadratic convergence of some primal-dual interior-point methods for constrained optimization. Math. Prog., 75:377–397, 1996.
  55. E. K. Yang and J. W. Tolle. A class of methods for solving large convex quadratic programs subject to box constraints. Math. Prog., 51:223–228, 1991.
  56. Y. Zhang. Interior-point gradient methods with diagonal-scalings for simple-bound constrained optimization. Technical Report TR04-06, Department of Computational and Applied Mathematics, Rice University, Houston, TX, 2004.
  57. C. Zhu, R. H. Byrd, and J. Nocedal. Algorithm 778: L-BFGS-B, Fortran subroutines for large-scale bound-constrained optimization. ACM Trans. Math. Software, 23:550–560, 1997.

Copyright information

© International Federation for Information Processing 2006

Authors and Affiliations

  • W. W. Hager, Department of Mathematics, University of Florida, Gainesville
  • H. Zhang, Department of Mathematics, University of Florida, Gainesville