A Lyapunov Function Construction for a Non-convex Douglas–Rachford Iteration

  • Ohad Giladi
  • Björn S. Rüffer


While global convergence of the Douglas–Rachford iteration is often observed in applications, proving it is still limited to convex and a handful of other special cases. Lyapunov functions for difference inclusions provide not only global or local convergence certificates, but also imply robust stability, meaning that convergence is still guaranteed in the presence of persistent disturbances. In this work, a global Lyapunov function is constructed by combining known local Lyapunov functions for simpler, local subproblems via an explicit formula that depends on the problem parameters. Specifically, we consider the scenario where one set is the union of two lines and the other set is a line, so that the two sets intersect in two distinct points. Locally, near each intersection point, the problem reduces to the intersection of just two lines, but globally the geometry is non-convex and the Douglas–Rachford operator is multi-valued. Our approach is intended to be prototypical for the convergence analysis of the Douglas–Rachford iteration in more complex geometries that can be approximated by polygonal sets, through the combination of local, simple Lyapunov functions.
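The setting described in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's Lyapunov construction: it iterates the Douglas–Rachford operator \(T = \tfrac{1}{2}(I + R_B R_A)\), with reflections \(R = 2P - I\), on a hypothetical instance in which \(A\) is the union of the two vertical lines \(x = \pm 1\) and \(B\) is the horizontal axis, so the sets meet in the two points \((\pm 1, 0)\). The projection onto the union \(A\) is multi-valued on the bisector \(x = 0\); here ties are broken by list order.

```python
import math

def proj_line(p, a, d):
    # Orthogonal projection of p onto the line {a + t*d}.
    t = ((p[0] - a[0]) * d[0] + (p[1] - a[1]) * d[1]) / (d[0]**2 + d[1]**2)
    return (a[0] + t * d[0], a[1] + t * d[1])

def proj_union(p, lines):
    # Nearest-point projection onto a union of lines; multi-valued on the
    # bisector, where ties are broken by list order in this sketch.
    cands = [proj_line(p, a, d) for a, d in lines]
    return min(cands, key=lambda q: math.hypot(q[0] - p[0], q[1] - p[1]))

def dr_step(p, projA, projB):
    # One Douglas-Rachford step T = (I + R_B R_A)/2 with R = 2P - I.
    pa = projA(p)
    ra = (2 * pa[0] - p[0], 2 * pa[1] - p[1])
    pb = projB(ra)
    rb = (2 * pb[0] - ra[0], 2 * pb[1] - ra[1])
    return ((p[0] + rb[0]) / 2, (p[1] + rb[1]) / 2)

# Hypothetical instance: A = union of the vertical lines x = 1 and x = -1,
# B = the x-axis; the two sets intersect at (1, 0) and (-1, 0).
A = [((1.0, 0.0), (0.0, 1.0)), ((-1.0, 0.0), (0.0, 1.0))]
projA = lambda p: proj_union(p, A)
projB = lambda p: (p[0], 0.0)

p = (0.3, 2.0)
for _ in range(50):
    p = dr_step(p, projA, projB)
print(p)  # (1.0, 0.0): the intersection point nearer the starting point
```

Because the lines in this toy instance are mutually orthogonal, the iteration reaches a fixed point very quickly; in general, which intersection point is found depends on the starting point, which is exactly the global behavior the paper's combined Lyapunov function is designed to certify.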


Keywords

Douglas–Rachford iteration · Lyapunov function · Robust \(\mathcal{KL}\)-stability · Non-convex optimization · Global convergence

Mathematics Subject Classification

47H10 · 47J25 · 37N40 · 90C26



O. Giladi has been supported by ARC Grant DP160101537. B. S. Rüffer has been supported by ARC Grant DP160102138. Both authors would like to thank the anonymous referees for their helpful comments.



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. School of Mathematical and Physical Sciences, University of Newcastle, Callaghan, Australia
