
A modified Dai–Kou-type method with applications to signal reconstruction and blurred image restoration

Computational and Applied Mathematics

Abstract

The classical Dai–Kou scheme (SIAM J Optim 23(1):296–320, 2013), popularly referred to as CGOPT, is among the most numerically efficient methods for unconstrained optimization. This article exploits attractive properties of the scheme, together with the projection method, to present an adaptive Dai–Kou-type method for solving constrained systems of nonlinear monotone equations. The new scheme uses the backtracking line search strategy of Zhang and Zhou (J Comput Appl Math 196:478–484, 2006) to determine the step size. In addition, the scheme requires little memory to implement because it avoids computing the Jacobian matrix, which makes it an ideal choice for large-scale problems. Further attributes of the scheme include search directions that satisfy the condition needed for global convergence and applicability to signal and image reconstruction problems in compressive sensing. Global convergence of the new scheme is proved under basic assumptions, and the numerical experiments conducted suggest that the new approach has a clear edge over four iterative schemes from the literature.
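The specific Dai–Kou-type direction and adaptive parameters of the new scheme are not reproduced in this preview. As a rough illustration of the general framework the abstract describes (a derivative-free conjugate-gradient-type search direction, a Zhang–Zhou-style backtracking line search, and a projection step that keeps iterates in the convex feasible set without any Jacobian computation), the Python sketch below solves a small monotone system subject to non-negativity constraints. The direction update (a Fletcher–Reeves-style stand-in), the parameters rho and sigma, and the test problem are illustrative assumptions, not the authors' method.

```python
import numpy as np

def projection_cg_solver(F, project, x0, tol=1e-6, max_iter=1000,
                         rho=0.5, sigma=1e-4):
    """Generic derivative-free CG-projection sketch for constrained
    monotone equations F(x) = 0, x in C (illustrative, not the paper's scheme).

    F       : monotone mapping R^n -> R^n
    project : Euclidean projection onto the closed convex set C
    x0      : starting point
    """
    x = project(np.asarray(x0, dtype=float))
    Fx = F(x)
    d = -Fx                                    # initial steepest-descent-like direction
    for k in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            return x, k
        # Backtracking line search in the spirit of Zhang-Zhou:
        # accept the largest t = rho^i with  -F(x + t d)^T d >= sigma * t * ||d||^2
        t = 1.0
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= rho
            if t < 1e-12:
                break
        z = x + t * d                          # trial point along the search direction
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            return z, k
        # Solodov-Svaiter hyperplane projection step back onto the feasible set
        xi = (Fz @ (x - z)) / (Fz @ Fz)
        x_next = project(x - xi * Fz)
        F_next = F(x_next)
        # Simple Fletcher-Reeves-style direction (a stand-in for the
        # Dai-Kou-type update, which is not reproduced here)
        beta = (F_next @ F_next) / (Fx @ Fx)
        d = -F_next + beta * d
        if -(F_next @ d) <= 0.0:               # safeguard: keep a descent-like direction
            d = -F_next
        x, Fx = x_next, F_next
    return x, max_iter

if __name__ == "__main__":
    # Toy monotone test problem: F(x) = x + sin(x) - 1 with constraint x >= 0
    F = lambda x: x + np.sin(x) - 1.0
    project = lambda x: np.maximum(x, 0.0)
    x_star, iters = projection_cg_solver(F, project, np.zeros(5))
    print(iters, np.linalg.norm(F(x_star)))
```

Because only evaluations of F and a projection operator are required, the memory footprint stays at a few n-vectors, which is the matrix-free property the abstract highlights for large-scale problems.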


References

  • Al-Baali M (1998) Numerical experience with a class of self-scaling quasi-Newton algorithms. J Optim Theory Appl 96(3):533–553
  • Babaie-Kafaki S, Ghanbari R, Mahdavi-Amiri N (2010) Two new conjugate gradient methods based on modified secant equations. J Comput Appl Math 234(5):1374–1386
  • Banham MR, Katsaggelos AK (1997) Digital image restoration. IEEE Signal Process Mag 14(2):24–41
  • Chan CL, Katsaggelos AK, Sahakian AV (1993) Image sequence filtering in quantum-limited noise with applications to low-dose fluoroscopy. IEEE Trans Med Imaging 12(3):610–621
  • Cheng W (2009) A PRP type method for systems of monotone equations. Math Comput Model 50:15–20
  • Dai YH, Kou CX (2013) A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J Optim 23(1):296–320
  • Dai YH, Yuan Y (1999) A nonlinear conjugate gradient method with a strong global convergence property. SIAM J Optim 10(1):177–182
  • Dennis J, Moré J (1977) Quasi-Newton methods, motivation and theory. SIAM Rev 19(1):46–89
  • Ding Y, Xiao Y, Li J (2017) A class of conjugate gradient methods for convex constrained monotone equations. Optimization 66(12):2309–2328
  • Dirkse SP, Ferris MC (1995) A collection of nonlinear mixed complementarity problems. Optim Methods Softw 5:319–345
  • Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91:201–213
  • Figueiredo M, Nowak R, Wright SJ (2007) Gradient projection for sparse reconstruction: application to compressed sensing and other inverse problems. IEEE J Sel Top Signal Process 1(4):586–597
  • Fletcher R (1997) Practical methods of optimization. Volume 1: unconstrained optimization, 2nd edn. Wiley, New York
  • Fletcher R, Reeves C (1964) Function minimization by conjugate gradients. Comput J 7:149–154
  • Halilu AS, Majumder A, Waziri MY, Awwal AM, Ahmed K (2021) On solving double direction methods for convex constrained monotone nonlinear equations with image restoration. Comput Appl Math 40:1–27
  • He BS, Yang H, Wang SL (2000) Alternating direction method with self-adaptive penalty parameters for monotone variational inequalities. J Optim Theory Appl 106:337–356
  • Hestenes MR, Stiefel EL (1952) Methods of conjugate gradients for solving linear systems. J Res Nat Bur Stand 49:409–436
  • Hu Y, Wang Y (2020) An efficient projected gradient method for convex constrained monotone equations with applications in compressive sensing. J Appl Math Phys 8:983–998
  • Kelley CT (1999) Iterative methods for optimization. Front Appl Math. SIAM, Philadelphia. https://doi.org/10.1137/1.9781611970920
  • La Cruz W (2017) A spectral algorithm for large-scale systems of nonlinear monotone equations. Numer Algorithm. https://doi.org/10.1007/s11075-017-0299-8
  • La Cruz W, Martínez JM, Raydan M (2004) Spectral residual method without gradient information for solving large-scale nonlinear systems: theory and experiments. Technical Report RT-04-08, Citeseer
  • Li DH, Fukushima M (2001) A modified BFGS method and its global convergence in non-convex minimization. J Comput Appl Math 129(1–2):15–35
  • Li DH, Wang XL (2011) A modified Fletcher–Reeves-type derivative-free method for symmetric nonlinear equations. Numer Algebra Control Optim 1(1):71–82
  • Liu J, Feng Y (2019) A derivative-free iterative method for nonlinear monotone equations with convex constraints. Numer Algorithm 82:245–262
  • Liu JK, Li SJ (2015) A projection method for convex constrained monotone nonlinear equations with applications. Comput Math Appl 70(10):2442–2453
  • Liu J, Li S (2017) Multivariate spectral projection method for convex constrained nonlinear monotone equations. J Ind Manag Optim 13(1):283–297
  • Liu Y, Storey C (1991) Efficient generalized conjugate gradient algorithms. Part 1: theory. J Optim Theory Appl 69:129–137
  • Liu JK, Xu JL, Zhang LQ (2018) Partially symmetrical derivative-free Liu–Storey projection method for convex constrained equations. Int J Comput Math. https://doi.org/10.1080/00207160.2018.1533122
  • Meintjes K, Morgan AP (1987) A methodology for solving chemical equilibrium systems. Appl Math Comput 22:333–361
  • Oren SS, Luenberger DG (1974) Self scaling variable metric (SSVM) algorithms, part I: criteria and sufficient conditions for scaling a class of algorithms. Manag Sci 20(5):845–862
  • Oren SS, Spedicato E (1976) Optimal conditioning of self scaling variable metric algorithms. Math Program 10:70–90
  • Pang JS (1986) Inexact Newton methods for the nonlinear complementarity problem. Math Program 36:54–71
  • Polak E, Ribière G (1969) Note sur la convergence de méthodes de directions conjuguées. Rev Française Informat Recherche Opérationnelle 3(16):35–43
  • Polyak BT (1969) The conjugate gradient method in extreme problems. USSR Comput Math Math Phys 9:94–112
  • Sabi'u J, Shah A, Waziri MY (2020) Two optimal Hager–Zhang conjugate gradient methods for solving monotone nonlinear equations. Appl Numer Math 153:217–233
  • Sabi'u J, Shah A, Waziri MY, Ahmed K (2021) Modified Hager–Zhang conjugate gradient methods via singular value analysis for solving monotone nonlinear equations with convex constraint. Int J Comput Methods. https://doi.org/10.1142/S0219876220500437
  • Slump CH (1992) Real-time image restoration in diagnostic X-ray imaging, the effects on quantum noise. In: Proceedings of the 11th IAPR International Conference on Pattern Recognition
  • Solodov MV, Iusem AN (1997) Newton-type methods with generalized distances for constrained optimization. Optimization 41(3):257–277
  • Solodov MV, Svaiter BF (1998) A globally convergent inexact Newton method for systems of monotone equations. In: Fukushima M, Qi L (eds) Reformulation: nonsmooth, piecewise smooth, semismooth and smoothing methods. Kluwer Academic Publishers, Amsterdam, pp 355–369
  • Wang XY, Li XJ, Kou XP (2016) A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints. Calcolo. https://doi.org/10.1007/s10092-015-0140-5
  • Waziri MY, Ahmed K (2022) Two descent Dai–Yuan conjugate gradient methods for systems of monotone nonlinear equations. J Sci Comput 90:36. https://doi.org/10.1007/s10915-021-01713-7
  • Waziri MY, Ahmed K, Sabi'u J (2019) A family of Hager–Zhang conjugate gradient methods for system of monotone nonlinear equations. Appl Math Comput 361:645–660
  • Waziri MY, Ahmed K, Sabi'u J (2020a) A Dai–Liao conjugate gradient method via modified secant equation for system of nonlinear equations. Arab J Math 9:443–457
  • Waziri MY, Ahmed K, Sabi'u J (2020b) Descent Perry conjugate gradient methods for systems of monotone nonlinear equations. Numer Algorithm 85:763–785
  • Waziri MY, Ahmed K, Sabi'u J, Halilu AS (2020c) Enhanced Dai–Liao conjugate gradient methods for systems of monotone nonlinear equations. SeMA J 78:15–51
  • Waziri MY, Ahmed K, Halilu AS, Awwal AM (2021a) Modified Dai–Yuan iterative scheme for nonlinear systems and its application. Numer Algebra Control Optim. https://doi.org/10.3934/naco.2021044
  • Waziri MY, Usman H, Halilu AS, Ahmed K (2021b) Modified matrix-free methods for solving systems of nonlinear equations. Optimization 70:2321–2340
  • Waziri MY, Ahmed K, Halilu AS, Sabi'u J (2022a) Two new Hager–Zhang iterative schemes with improved parameter choices for monotone nonlinear systems and their applications in compressed sensing. RAIRO Oper Res. https://doi.org/10.1051/ro/2021190
  • Waziri MY, Ahmed K, Halilu AS (2022b) A modified PRP-type conjugate gradient projection algorithm for solving large-scale monotone nonlinear equations with convex constraint. J Comput Appl Math 407:114035
  • Waziri MY, Ahmed K, Halilu AS (2022c) Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations. São Paulo J Math Sci. https://doi.org/10.1007/s40863-022-00293-0
  • Xiao Y, Zhu H (2013) A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing. J Math Anal Appl 405(1):310–319
  • Xiao Y, Wang Q, Hu Q (2011) Non-smooth equations based method for \(\ell_1\)-norm problems with applications to compressed sensing. Nonlinear Anal Theory Methods Appl 74(11):3570–3577
  • Yu GH, Niu SZ, Ma JH (2013) Multivariate spectral gradient projection method for nonlinear monotone equations with convex constraints. J Ind Manag Optim 9:117–129
  • Zhang J, Xu C (2001) Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J Comput Appl Math 137(2):269–278
  • Zhang JZ, Deng NY, Chen LH (1999) New quasi-Newton equation and related methods for unconstrained optimization. J Optim Theory Appl 102(1):147–167
  • Zhang L, Zhou W, Li DH (2006) Global convergence of a modified Fletcher–Reeves conjugate gradient method with Armijo-type line search. Numer Math 104:561–572
  • Zhao YB, Li D (2001) Monotonicity of fixed point and normal mappings associated with variational inequality and its application. SIAM J Optim 11:962–973


Acknowledgements

The authors are grateful for the helpful and constructive comments by the anonymous reviewers and editors.

Author information


Corresponding author

Correspondence to Mohammed Yusuf Waziri.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Communicated by Carlos Conca.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Waziri, M.Y., Ahmed, K. & Halilu, A.S. A modified Dai–Kou-type method with applications to signal reconstruction and blurred image restoration. Comp. Appl. Math. 41, 232 (2022). https://doi.org/10.1007/s40314-022-01917-z


  • DOI: https://doi.org/10.1007/s40314-022-01917-z
