Abstract
This paper introduces CLS, a new line search along an arbitrary smooth search path that starts at the current iterate tangentially to a descent direction. Like the Goldstein line search, and unlike the Wolfe line search, the new line search uses only function values beyond the gradient at the current iterate. Using this line search with search directions satisfying the bounded angle condition, global convergence to a stationary point is proved for continuously differentiable objective functions that are bounded below and have Lipschitz continuous gradients. Standard complexity bounds are proved under several natural assumptions.
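Since the abstract only states that CLS, like the Goldstein line search, requires no gradient evaluations beyond the one at the current iterate, the following is a minimal sketch of the classical two-sided Goldstein line search along a straight path, which CLS generalizes to smooth curved paths. The bracketing scheme, function names, and constants below are illustrative assumptions, not the authors' algorithm; see the paper and the supplement [18] for CLS itself.

```python
import numpy as np

def goldstein_line_search(f, x, p, g, c=0.25, alpha=1.0, max_iter=50):
    """Find alpha > 0 satisfying the two-sided Goldstein conditions
        f(x) + (1-c)*alpha*g@p <= f(x + alpha*p) <= f(x) + c*alpha*g@p
    with 0 < c < 1/2, using only function values beyond g = grad f(x)."""
    fx = f(x)
    gTp = g @ p                     # < 0 for a descent direction p
    lo, hi = 0.0, np.inf            # bracket containing acceptable steps
    for _ in range(max_iter):
        fa = f(x + alpha * p)
        if fa > fx + c * alpha * gTp:          # insufficient decrease
            hi = alpha                         # shrink from above
        elif fa < fx + (1 - c) * alpha * gTp:  # step too short
            lo = alpha                         # grow from below
        else:
            return alpha                       # both conditions hold
        # bisect the bracket, or extrapolate while it is unbounded
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha                               # best effort after max_iter
```

For example, with f(x) = x @ x, x = np.ones(2), g = 2 * x, and p = -g, the first trial alpha = 1 overshoots the Goldstein band and the bisection accepts alpha = 0.5 at the next trial. The geometric extrapolation while the bracket is unbounded mirrors the standard bracketing argument used in convergence proofs for Goldstein-type searches.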
Data availability
The online supplement is available in [18].
References
Armijo, L.: Minimization of functions having Lipschitz continuous first partial derivatives. Pacific J. Math. 16(1), 1–3 (1966)
Cartis, C., Gould, N.I.M., Toint, Ph.L.: Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation and Perspectives. MOS-SIAM Series on Optimization, vol. MO30. SIAM, Philadelphia (2022)
Cartis, C., Sampaio, Ph.R., Toint, Ph.L.: Worst-case evaluation complexity of non-monotone gradient-related algorithms for unconstrained optimization. Optimization 64(3), 1349–1361 (2015)
Corliss, G., Faure, C., Griewank, A., Hascoët, L., Naumann, U.: Automatic Differentiation of Algorithms: From Simulation to Optimization. Springer, New York (2002)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
Fletcher, R.: Practical Methods of Optimization. Wiley, New York (2000)
Goldstein, A., Price, J.: An effective algorithm for minimization. Numer. Math. 10, 184–189 (1967)
Goldstein, A.A.: On steepest descent. J. SIAM Ser. A Control 3, 147–151 (1965)
Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization. Comput. Optim. Appl. 60, 545–557 (2015)
Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32, 113–137 (2006)
Karimi, H., Nutini, J., Schmidt, M.: Linear convergence of gradient and proximal-gradient methods under the Polyak–Łojasiewicz condition. In: Machine Learning and Knowledge Discovery in Databases, pp. 795–811 (2016)
Kimiaei, M., Neumaier, A., Azmi, B.: LMBOPT—a limited memory method for bound-constrained optimization. Math. Program. Comput. 14, 271–318 (2022)
Lemaréchal, C.: A view of line-searches. In: Optimization and Optimal Control. Proceedings of a Conference Held at Oberwolfach, March 16–22, 1980, pp. 59–78. Springer, Berlin (1980)
Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Program. 45, 503–528 (1989)
Moré, J.J., Thuente, D.J.: Line search algorithms with guaranteed sufficient decrease. ACM Trans. Math. Softw. 20, 286–307 (1994)
Neumaier, A., Azmi, B.: Line search and convergence in bound-constrained optimization. Unpublished manuscript, University of Vienna (2019). http://www.optimization-online.org/DB_HTML/2019/03/7138.html
Neumaier, A., Azmi, B., Kimiaei, M.: An active set method for bound-constrained optimization. Manuscript (2023). https://optimization-online.org/?p=21354
Neumaier, A., Kimiaei, M.: An improvement of the Goldstein line search. Supplemental material (2023). https://github.com/GS1400/CLS_supplement.git
Neumaier, A., Kimiaei, M., Azmi, B.: Globally linearly convergent nonlinear conjugate gradients without Wolfe line search. Manuscript (2023). https://optimization-online.org/?p=21354
Nocedal, J., Wright, S.: Numerical Optimization. Springer, Berlin (2006)
Royer, C.W., Wright, S.J.: Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization. SIAM J. Optim. 28, 1448–1477 (2018)
Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, Berlin (2006)
Warth, W., Werner, J.: Effiziente Schrittweitenfunktionen bei unrestringierten Optimierungsaufgaben. Computing 19, 59–72 (1977)
Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11, 226–235 (1969)
Zoutendijk, G.: Nonlinear programming, computational methods. In: Abadie, J. (ed.) Integer and Nonlinear Programming, pp. 37–86. North-Holland, Amsterdam (1970)
Acknowledgements
The second author acknowledges financial support of the Austrian Science Fund (FWF) under Project No. P 34317. The authors are grateful for the thoughtful comments of two reviewers and the associate editor.
Ethics declarations
Conflict of interest
The authors declare that they have no competing interests.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Neumaier, A., Kimiaei, M. An improvement of the Goldstein line search. Optim Lett (2024). https://doi.org/10.1007/s11590-024-02110-3