
An almost cyclic 2-coordinate descent method for singly linearly constrained problems

Published in: Computational Optimization and Applications

Abstract

A block decomposition method is proposed for minimizing a (possibly non-convex) continuously differentiable function subject to one linear equality constraint and simple bounds on the variables. The proposed method iteratively selects a pair of coordinates according to an almost cyclic strategy that does not rely on first-order information, so that the whole gradient of the objective function need not be computed during the algorithm. Using first-order search directions to update each pair of coordinates, global convergence to stationary points is established, for different choices of the stepsize, under an appropriate assumption on the level set. In particular, both inexact and exact line search strategies are analyzed. Moreover, a linear convergence rate is proved under standard additional assumptions. Numerical results are finally provided to show the effectiveness of the proposed method.
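To illustrate the kind of update the abstract describes, the sketch below performs one 2-coordinate step on min f(x) s.t. sum(x) = b, l ≤ x ≤ u: moving along the direction e_i − e_j preserves the equality constraint, and only the two partial derivatives at coordinates i and j are needed, never the full gradient. This is a simplified illustration with hypothetical helper names (`f`, `grad_i`), using a plain Armijo backtracking line search; it is not the paper's exact algorithm or stepsize rule.

```python
import numpy as np

def two_coord_step(f, grad_i, x, i, j, l, u,
                   gamma=1e-4, delta=0.5, alpha0=1.0):
    """One 2-coordinate update along d = e_i - e_j (keeps sum(x) fixed).

    f       : objective, callable on a full vector
    grad_i  : callable (x, k) -> k-th partial derivative of f at x
    Only two partial derivatives are evaluated per step.
    """
    g = grad_i(x, i) - grad_i(x, j)   # directional derivative along e_i - e_j
    if g == 0.0:
        return x
    s = -np.sign(g)                   # move downhill along +/-(e_i - e_j)
    # largest step keeping x + alpha*s*(e_i - e_j) inside the box [l, u]
    if s > 0:
        alpha_max = min(u[i] - x[i], x[j] - l[j])
    else:
        alpha_max = min(x[i] - l[i], u[j] - x[j])
    alpha = min(alpha0, alpha_max)
    fx = f(x)
    while alpha > 1e-12:              # Armijo backtracking line search
        y = x.copy()
        y[i] += alpha * s
        y[j] -= alpha * s
        if f(y) <= fx - gamma * alpha * abs(g):
            return y
        alpha *= delta
    return x
```

For example, one step on f(x) = ||x − c||² over the unit simplex (b = 1, l = 0, u = 1) decreases the objective while keeping the iterate feasible. An almost cyclic method would sweep such pairs (i, j) in a fixed-order fashion rather than selecting them via gradient information.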



Author information

Correspondence to Andrea Cristofari.


Cite this article

Cristofari, A. An almost cyclic 2-coordinate descent method for singly linearly constrained problems. Comput Optim Appl 73, 411–452 (2019). https://doi.org/10.1007/s10589-019-00082-0
