On Proximal Subgradient Splitting Method for Minimizing the Sum of Two Nonsmooth Convex Functions

Abstract

In this paper we present a variant of the proximal forward-backward splitting iteration for solving nonsmooth optimization problems in Hilbert spaces, when the objective function is the sum of two nondifferentiable convex functions. The proposed iteration, which we call the Proximal Subgradient Splitting Method, extends the classical subgradient iteration to important classes of problems by exploiting the additive structure of the objective function. Weak convergence of the generated sequence is established for several stepsize choices and under suitable assumptions. Moreover, we analyze the iteration complexity of the method.
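To make the iteration concrete, the following Python sketch implements one natural reading of the method described above: the forward gradient step of classical forward-backward splitting is replaced by a subgradient step, i.e. x^{k+1} = prox_{α_k g}(x^k − α_k u^k) with u^k ∈ ∂f(x^k). The test problem f(x) = ‖Ax − b‖₁ and g(x) = λ‖x‖₁, the stepsize rule α_k = 1/√(k+1), and all names below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_subgradient_splitting(A, b, lam, x0, n_iter=500):
    # Sketch of the iteration x_{k+1} = prox_{a_k g}(x_k - a_k u_k),
    # with u_k a subgradient of f at x_k, for the illustrative problem
    #   f(x) = ||Ax - b||_1   (nonsmooth; handled by a subgradient step),
    #   g(x) = lam * ||x||_1  (nonsmooth; handled by its exact prox).
    x = x0.astype(float).copy()
    best_x, best_val = x.copy(), np.inf
    for k in range(n_iter):
        u = A.T @ np.sign(A @ x - b)      # a valid subgradient of f at x
        alpha = 1.0 / np.sqrt(k + 1)      # diminishing stepsize (illustrative)
        x = soft_threshold(x - alpha * u, alpha * lam)  # prox of alpha * g
        val = np.abs(A @ x - b).sum() + lam * np.abs(x).sum()
        if val < best_val:                # subgradient steps need not descend,
            best_x, best_val = x.copy(), val  # so track the best iterate seen
    return best_x, best_val

# Illustrative usage on random data.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 50)), rng.standard_normal(20)
x_best, f_best = proximal_subgradient_splitting(A, b, lam=0.1, x0=np.zeros(50))
```

Since subgradient steps are not descent steps, the sketch returns the best objective value found along the trajectory, which is the quantity for which complexity estimates of this kind are typically stated.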


Author information

Correspondence to José Yunier Bello Cruz.

About this article

Cite this article

Bello Cruz, J.Y. On Proximal Subgradient Splitting Method for Minimizing the Sum of Two Nonsmooth Convex Functions. Set-Valued Var. Anal 25, 245–263 (2017). https://doi.org/10.1007/s11228-016-0376-5
