A note on the Douglas–Rachford splitting method for optimization problems involving hypoconvex functions

Abstract

Recently, the convergence of the Douglas–Rachford splitting method (DRSM) was established for minimizing the sum of a nonsmooth strongly convex function and a nonsmooth hypoconvex function under the assumption that the strong convexity constant \(\beta \) is larger than the hypoconvexity constant \(\omega \). Such an assumption, implying the strong convexity of the objective function, precludes many interesting applications. In this paper, we prove the convergence of the DRSM for the case \(\beta =\omega \), under relatively mild assumptions compared with some existing work in the literature.
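
For orientation, the iteration discussed here admits a compact description in terms of proximal steps. Below is a minimal sketch of the Douglas–Rachford splitting iteration for minimizing \(f+g\). The concrete instance (a least-squares fit for \(f\) and an \(\ell_1\)-minus-quadratic weakly convex penalty for \(g\)), together with all parameters (`A`, `b`, `lam`, `omega`, `gamma`), is a hypothetical illustration chosen for the sketch, not the setting analyzed in the paper.

```python
import numpy as np

# Minimal sketch of the Douglas-Rachford splitting iteration for minimizing
# f(x) + g(x) via proximal steps. The problem instance below is hypothetical:
# f is a strongly convex least-squares term and g is a weakly convex
# (hypoconvex) penalty g(x) = lam*||x||_1 - (omega/2)*||x||^2.

def drsm(prox_f, prox_g, z0, gamma, iters=500):
    """Generic Douglas-Rachford splitting; returns the final x-iterate."""
    z = z0.copy()
    for _ in range(iters):
        x = prox_f(z, gamma)            # x^{k+1} = prox_{gamma f}(z^k)
        y = prox_g(2.0 * x - z, gamma)  # y^{k+1} = prox_{gamma g}(2 x^{k+1} - z^k)
        z = z + y - x                   # z^{k+1} = z^k + y^{k+1} - x^{k+1}
    return x

# Hypothetical problem data (assumed, not from the paper).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
lam, omega = 0.5, 0.1   # penalty weight and hypoconvexity constant (assumed)
gamma = 1.0             # step size; prox_g below requires gamma*omega < 1

def prox_f(v, gamma):
    # prox of f(x) = 0.5*||A x - b||^2, strongly convex when A has full column rank
    n = A.shape[1]
    return np.linalg.solve(np.eye(n) + gamma * A.T @ A, v + gamma * A.T @ b)

def prox_g(v, gamma):
    # prox of g(x) = lam*||x||_1 - (omega/2)*||x||^2; for gamma*omega < 1 it
    # reduces to soft-thresholding of a rescaled point
    scale = 1.0 - gamma * omega
    u = v / scale
    return np.sign(u) * np.maximum(np.abs(u) - gamma * lam / scale, 0.0)

x_star = drsm(prox_f, prox_g, z0=np.zeros(10), gamma=gamma)
print(x_star)
```

The sketch only fixes the order of the three updates; any proximal mappings and any step size for which both proximal subproblems are well posed can be swapped in.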


Author information

Corresponding author

Correspondence to Deren Han.

Additional information

K. Guo was supported by the Natural Science Foundation of China (Grant No. 11571178) and the Fundamental Research Funds of China West Normal University (Grant No. 412698). D. Han was supported by a project funded by the PAPD of Jiangsu Higher Education Institutions and by the Natural Science Foundation of China (Grant Nos. 11625105, 11371197 and 11431002).

About this article

Cite this article

Guo, K., Han, D. A note on the Douglas–Rachford splitting method for optimization problems involving hypoconvex functions. J Glob Optim 72, 431–441 (2018). https://doi.org/10.1007/s10898-018-0660-z
