On perturbed hybrid steepest descent method with minimization or superiorization for subdifferentiable functions

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

The hybrid steepest descent method (HSDM) can be applied to find the minimum value of a differentiable function over a nonempty closed convex subset of a Hilbert space. In this work, we study perturbed algorithms in line with a generalized HSDM and discuss how certain choices of perturbations can increase the convergence speed. When we specialize these results to constrained minimization, the perturbations become the bounded perturbations used in the superiorization methodology (SM). We show the usefulness of the SM in studying the constrained convex minimization problem for subdifferentiable functions, and we then study the computational efficiency of the SM compared with the HSDM. In a computational experiment comparing the HSDM with superiorization, the latter appears advantageous for that specific experiment.
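To make the two schemes in the abstract concrete, here is a minimal sketch contrasting an HSDM-style step (apply a nonexpansive operator T, then step against a subgradient of the objective f) with a superiorized step (add a bounded perturbation that does not increase f, then apply T). Every concrete choice here is an illustrative assumption rather than the paper's construction: T is taken as a ball projection, f is a toy distance function, and the sequences lambda_k = 1/(k+1) (diminishing, non-summable) and beta_k = 0.5^k (summable) are textbook choices.

```python
# Hedged sketch of two generic iterations; not the paper's exact algorithms.
# Assumed (hypothetical) ingredients: a nonexpansive operator T given as a
# ball projection, a toy convex objective f, and a subgradient of f.
import numpy as np

def project_ball(x, radius=1.0):
    """Projection onto a Euclidean ball, standing in for the operator T."""
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else (radius / nrm) * x

anchor = np.array([2.0, 1.0])  # toy data; lies outside the unit ball

def f(x):
    """Toy convex objective: distance to a fixed anchor point."""
    return np.linalg.norm(x - anchor)

def subgrad_f(x):
    """A subgradient of f (equal to the gradient wherever f is smooth)."""
    d = x - anchor
    nrm = np.linalg.norm(d)
    return d / nrm if nrm > 0 else np.zeros_like(d)

x_hsdm = np.array([0.0, 0.0])
x_sup = np.array([0.0, 0.0])

for k in range(200):
    lam = 1.0 / (k + 1)   # diminishing HSDM step sizes
    beta = 0.5 ** k       # summable superiorization coefficients
    # HSDM-style step: apply T first, then move against a subgradient of f.
    y = project_ball(x_hsdm)
    x_hsdm = y - lam * subgrad_f(y)
    # Superiorized step: bounded perturbation toward lower f, then apply T.
    v = -subgrad_f(x_sup)  # unit-norm nonascending direction for f at x_sup
    x_sup = project_ball(x_sup + beta * v)

print("HSDM-style:  ", x_hsdm, " f =", f(x_hsdm))
print("Superiorized:", x_sup, " f =", f(x_sup))
```

In this toy setting, both sequences approach the constrained minimizer of f over the ball; the sketch only illustrates the structural difference between perturbing after the operator (HSDM) and before it (superiorization).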


Acknowledgments

The authors thank the referee for valuable and constructive comments that greatly improved the final version of this paper.

Author information

Corresponding author

Correspondence to Mohsen Hoseini.

About this article

Cite this article

Hoseini, M., Saeidi, S. & Kim, D.S. On perturbed hybrid steepest descent method with minimization or superiorization for subdifferentiable functions. Numer Algor 85, 353–374 (2020). https://doi.org/10.1007/s11075-019-00818-3
