Scaled nonlinear conjugate gradient methods for nonlinear least squares problems

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

We propose a modified structured secant relation to obtain a more accurate approximation of the second-order curvature of the least squares objective function. Using this relation and an approach introduced by Andrei, we then propose three scaled nonlinear conjugate gradient methods for nonlinear least squares problems. An attractive feature of one of the proposed methods is that it satisfies the sufficient descent condition regardless of the line search and of the convexity of the objective function. We establish that all three algorithms are globally convergent: one under the assumption that the Jacobian matrix has full column rank on the level set, and the other two without this assumption. Numerical experiments on a collection of test problems, both zero-residual and nonzero-residual, evaluated with the Dolan–Moré performance profiles, show that the advantage of the proposed algorithms is most pronounced on nonzero-residual and large-scale problems.


References

  1. Amini, K., Ghorbani Rizi, A.: A new structured quasi-Newton algorithm using partial information on Hessian. J. Comput. Appl. Math. 234, 805–811 (2010)
  2. Al-Baali, M., Fletcher, R.: Variational methods for non-linear least squares. J. Oper. Res. Soc. 36, 405–421 (1985)
  3. Andrei, N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22, 561–571 (2007)
  4. Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. European J. Oper. Res. 204, 410–420 (2010)
  5. Babaie-Kafaki, S.: A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei. Comput. Optim. Appl. 52, 409–414 (2012)
  6. Babaie-Kafaki, S., Ghanbari, R., Mahdavi-Amiri, N.: Two new conjugate gradient methods based on modified secant equations. J. Comput. Appl. Math. 234, 1374–1386 (2010)
  7. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)
  8. Birgin, E.G., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43, 117–128 (2001)
  9. Broyden, C.G.: A class of methods for solving nonlinear simultaneous equations. Math. Comput. 19, 577–593 (1965)
  10. Chen, L., Deng, N., Zhang, J.: A modified quasi-Newton method for structured optimization with partial information on the Hessian. Comput. Optim. Appl. 35, 5–18 (2006)
  11. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)
  12. Dennis, J.E., Gay, D.M., Welsch, R.E.: An adaptive nonlinear least-squares algorithm. ACM Trans. Math. Softw. 7, 348–368 (1981)
  13. Dennis, J.E., Martínez, H.J., Tapia, R.A.: Convergence theory for the structured BFGS secant method with an application to nonlinear least squares. J. Optim. Theory Appl. 61, 161–178 (1989)
  14. Dennis, J.E., Moré, J.J.: A characterization of superlinear convergence and its application to quasi-Newton methods. Math. Comput. 28, 549–560 (1974)
  15. Dennis, J.E., Moré, J.J.: Quasi-Newton methods, motivation and theory. SIAM Rev. 19, 46–89 (1977)
  16. Dennis, J.E., Walker, H.F.: Convergence theorems for least-change secant update methods. SIAM J. Numer. Anal. 18, 949–987 (1981)
  17. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
  18. Engels, J.R., Martínez, H.J.: Local and superlinear convergence for partially known quasi-Newton methods. SIAM J. Optim. 1, 42–56 (1991)
  19. Fan, J., Yuan, Y.: On the quadratic convergence of the Levenberg-Marquardt method without nonsingularity assumption. Computing 74, 23–39 (2005)
  20. Fletcher, R.: On the Barzilai-Borwein method. In: Optimization and Control with Applications, pp. 235–256. Springer, Boston (2005)
  21. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
  22. Ford, J.A., Narushima, Y., Yabe, H.: Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Comput. Optim. Appl. 40, 191–216 (2008)
  23. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)
  24. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)
  25. Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32, 113–137 (2006)
  26. Huschens, J.: On the use of product structure in secant methods for nonlinear least squares problems. SIAM J. Optim. 4, 108–129 (1994)
  27. Kobayashi, M., Narushima, Y., Yabe, H.: Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems. J. Comput. Appl. Math. 234, 375–397 (2010)
  28. Li, G., Tang, C., Wei, Z.: New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 202, 523–539 (2007)
  29. Lukšan, L., Vlček, J.: Sparse and partially separable test problems for unconstrained and equality constrained optimization. Technical Report No. 767, Institute of Computer Science, Academy of Sciences of the Czech Republic (1999)
  30. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (2006)
  31. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Recherche Opérationnelle 16, 35–43 (1969)
  32. Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969)
  33. Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7, 26–33 (1997)
  34. Shanno, D.F.: Conjugate gradient methods with inexact searches. Math. Oper. Res. 3, 244–256 (1978)
  35. Shanno, D.F., Phua, K.H.: Matrix conditioning and nonlinear optimization. Math. Program. 14, 149–160 (1978)
  36. Shanno, D.F., Phua, K.H.: Algorithm 500: minimization of unconstrained multivariate functions. ACM Trans. Math. Softw. 2, 87–94 (1976)
  37. Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153, 733–757 (2012)
  38. Wei, Z., Li, G., Qi, L.: New quasi-Newton methods for unconstrained optimization problems. Appl. Math. Comput. 175, 1156–1188 (2006)
  39. Yabe, H., Takano, M.: Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput. Optim. Appl. 28, 203–225 (2004)
  40. Yabe, H., Yamaki, N.: Local and superlinear convergence of structured quasi-Newton methods for nonlinear optimization. J. Oper. Res. Soc. Jpn. 39, 541–557 (1996)
  41. Yuan, G., Wei, Z.: Convergence analysis of a modified BFGS method on convex minimizations. Comput. Optim. Appl. 47, 237–255 (2010)
  42. Zhang, J.Z., Deng, N.Y., Chen, L.H.: New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102, 147–167 (1999)
  43. Zhang, J.Z., Xu, C.X.: Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J. Comput. Appl. Math. 137, 269–278 (2001)
  44. Zhou, W.: On the convergence of the modified Levenberg-Marquardt method with a nonmonotone second order Armijo type line search. J. Comput. Appl. Math. 239, 152–161 (2013)


Acknowledgements

The first author thanks Yazd University and the second author thanks Sharif University of Technology for supporting this work.

Author information

Correspondence to N. Mahdavi-Amiri.

About this article


Cite this article

Dehghani, R., Mahdavi-Amiri, N. Scaled nonlinear conjugate gradient methods for nonlinear least squares problems. Numer Algor 82, 1–20 (2019). https://doi.org/10.1007/s11075-018-0591-2

