Abstract
Based on the augmented version of the quasi-Newton method proposed by Aminifard et al. (Appl Numer Math 167:187–201, 2021), a new scaling parameter for the self-scaling memoryless BFGS update formula is proposed. The idea is to cluster the eigenvalues of the search direction matrix by minimizing the difference between its largest and smallest eigenvalues. The sufficient descent property is proved for uniformly convex functions, and the global convergence of the proposed algorithm is proved for both uniformly convex and general nonlinear objective functions. Numerical experiments on a set of test functions from the CUTEr collection show that the proposed method is efficient. In addition, the proposed algorithm is effectively applied to the salt-and-pepper noise elimination problem.
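To make the setting concrete, the sketch below computes a search direction of the general self-scaling memoryless BFGS form the abstract refers to: the inverse-Hessian approximation is the BFGS update of the scaled identity τI built from the most recent step pair (s, y), so d = −Hg needs only vector operations. This is a minimal illustration, not the paper's algorithm; in particular, the Oren–Luenberger choice τ = sᵀy/yᵀy used here is a classical stand-in for the new eigenvalue-clustering parameter derived in the article, and the function name is ours.

```python
import numpy as np

def ssml_bfgs_direction(g, s, y, tau):
    """Direction d = -H g, where H is the BFGS update of tau*I
    built from the step s = x_{k+1} - x_k and the gradient
    difference y = g_{k+1} - g_k (memoryless: no stored matrix).

    Illustrative sketch only; the scaling parameter tau of the
    paper is chosen differently (to cluster the eigenvalues of H).
    """
    sy = float(s @ y)
    if sy <= 1e-12:
        # Curvature safeguard: without s^T y > 0 the update need not
        # be positive definite, so fall back to steepest descent.
        return -g
    rho = 1.0 / sy
    # H g expanded in vector form:
    # H = tau*I - tau*rho*(s y^T + y s^T) + (tau*rho^2*(y^T y) + rho) s s^T
    Hg = (tau * g
          - tau * rho * (s * (y @ g) + y * (s @ g))
          + (tau * rho ** 2 * (y @ y) + rho) * s * (s @ g))
    return -Hg

# Example with the classical Oren-Luenberger scaling tau = s^T y / y^T y.
s = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 1.0, 4.0])          # s^T y = 16 > 0
g = np.array([1.0, -1.0, 0.5])
tau = float(s @ y) / float(y @ y)
d = ssml_bfgs_direction(g, s, y, tau)
```

Since s ᵀy > 0 and τ > 0 keep H positive definite, the resulting d is always a descent direction (gᵀd < 0); the paper's contribution is a specific τ that additionally tightens the spread of H's eigenvalues.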
References
Aminifard Z, Babaie-Kafaki S, Ghafoori S (2021) An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing. Appl Numer Math 167:187–201
Andrei N (2017) Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization. Optim Methods Softw 32(3):534–551
Andrei N (2020) New conjugate gradient algorithms based on self-scaling memoryless Broyden–Fletcher–Goldfarb–Shanno method. Calcolo 57(2):1–27
Andrei N (2020) A double parameter self-scaling memoryless BFGS method for unconstrained optimization. Comput Appl Math 39(3):159. https://doi.org/10.1007/s40314-020-01157-z
Arazm MR, Babaie-Kafaki S, Ghanbari R (2017) An extended Dai–Liao conjugate gradient method with global convergence for nonconvex functions. Glasnik Matematički 52(2):361–375
Babaie-Kafaki S, Mirhoseini N, Aminifard Z (2022) A descent extension of a modified Polak–Ribière–Polyak method with application in image restoration problem. Optim Lett
Babaie-Kafaki S (2013) A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization. 4OR 11(4):361–374. https://doi.org/10.1007/s10288-013-0233-4
Babaie-Kafaki S (2016) A modified scaling parameter for the memoryless BFGS updating formula. Numer Algorithms 72(2):425–433
Babaie-Kafaki S, Aminifard Z (2019) Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length. Numer Algorithms 82(4):1345–1357
Byrd RH, Nocedal J (1989) A tool for the analysis of quasi-Newton methods with application to unconstrained minimization. SIAM J Numer Anal 26(3):727–739. https://doi.org/10.1137/0726042
Dai Y-H, Kou C-X (2013) A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J Optim 23(1):296–320. https://doi.org/10.1137/100813026
Dennis JE Jr, Wolkowicz H (1993) Sizing and least-change secant methods. SIAM J Numer Anal 30(5):1291–1314. https://doi.org/10.1137/0730067
Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91(2, Ser. A):201–213. https://doi.org/10.1007/s101070100263
Gould NI, Orban D, Toint PL (2003) CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw (TOMS) 29(4):373–394
Heravi AR, Hodtani GA (2018) A new correntropy-based conjugate gradient backpropagation algorithm for improving training in neural networks. IEEE Trans Neural Netw Learn Syst 29(12):6252–6263
Li D-H, Fukushima M (2001) A modified BFGS method and its global convergence in nonconvex minimization. J Comput Appl Math 129(1–2):15–35. https://doi.org/10.1016/S0377-0427(00)00540-9
Li W, Liu Y, Yang J, Wu W (2018) A new conjugate gradient method with smoothing \(l_{1/2}\) regularization based on a modified secant equation for training neural networks. Neural Process Lett 48(2):955–978
Li M, Liu H, Liu Z (2018) A new family of conjugate gradient methods for unconstrained optimization. J Appl Math Comput 58(1–2):219–234. https://doi.org/10.1007/s12190-017-1141-0
Liao A (1997) Modifying the BFGS method. Oper Res Lett 20(4):171–177. https://doi.org/10.1016/S0167-6377(96)00050-8
Livieris IE, Tampakas V, Pintelas P (2018) A descent hybrid conjugate gradient method based on the memoryless BFGS update. Numer Algorithms 79(4):1169–1185
Oren SS, Luenberger DG (1973/74) Self-scaling variable metric (SSVM) algorithms. I. Criteria and sufficient conditions for scaling a class of algorithms. Manag Sci 20:845–862. https://doi.org/10.1287/mnsc.20.5.845
Oren SS, Spedicato E (1976) Optimal conditioning of self-scaling variable metric algorithms. Math Program 10(1):70–90. https://doi.org/10.1007/BF01580654
Sugiki K, Narushima Y, Yabe H (2012) Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J Optim Theory Appl 153(3):733–757. https://doi.org/10.1007/s10957-011-9960-x
Sun W, Yuan Y-X (2006) Optimization theory and methods: nonlinear programming. Springer optimization and its applications, vol 1. Springer, New York, p 687
Ullah N, Sabi’u J, Shah A (2021) A derivative-free scaling memoryless Broyden–Fletcher–Goldfarb–Shanno method for solving a system of monotone nonlinear equations. Numer Linear Algebra Appl 28(5):e2374
Wei Z, Li G, Qi L (2006) New quasi-Newton methods for unconstrained optimization problems. Appl Math Comput 175(2):1156–1188. https://doi.org/10.1016/j.amc.2005.08.027
Nocedal J, Wright SJ (1999) Numerical optimization. Springer, New York
Yu G, Huang J, Zhou Y (2010) A descent spectral conjugate gradient method for impulse noise removal. Appl Math Lett 23(5):555–560. https://doi.org/10.1016/j.aml.2010.01.010
Yuan G, Sheng Z, Wang B, Hu W, Li C (2018) The global convergence of a modified BFGS method for nonconvex functions. J Comput Appl Math 327:274–294. https://doi.org/10.1016/j.cam.2017.05.030
Yuan G, Lu J, Wang Z (2020) The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems. Appl Numer Math 152:1–11. https://doi.org/10.1016/j.apnum.2020.01.019
Yuan G, Wang Z, Li P (2020) A modified Broyden family algorithm with global convergence under a weak Wolfe–Powell line search for unconstrained nonconvex problems. Calcolo 57(4):35. https://doi.org/10.1007/s10092-020-00383-5
Yuan G, Zhang M, Zhou Y (2022) Adaptive scaling damped BFGS method without gradient Lipschitz continuity. Appl Math Lett 124:107634. https://doi.org/10.1016/j.aml.2021.107634
Zhang JZ, Deng NY, Chen LH (1999) New quasi-Newton equation and related methods for unconstrained optimization. J Optim Theory Appl 102(1):147–167. https://doi.org/10.1023/A:1021898630001
Zhou W, Zhang L (2006) A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim Methods Softw 21(5):707–714. https://doi.org/10.1080/10556780500137041
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Cite this article
Jourak, M., Nezhadhosein, S. & Rahpeymaii, F. A new self-scaling memoryless quasi-Newton update for unconstrained optimization. 4OR-Q J Oper Res 22, 235–252 (2024). https://doi.org/10.1007/s10288-023-00544-6