Stochastic Heavy-Ball Method for Constrained Stochastic Optimization Problems

Abstract

In this paper, we consider a heavy-ball method for constrained stochastic optimization problems, focusing on the situation in which the constraint set is given as the intersection of finitely many constraint sets. We propose a variant of the stochastic heavy-ball method in which each iteration combines a stochastic heavy-ball step with a projection onto a randomly selected constraint set. Almost sure convergence of the iterates generated by the proposed method to a solution is established. Finally, a numerical experiment is discussed.
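To make the description above more concrete, the following is a minimal Python sketch of a stochastic heavy-ball iteration combined with a projection onto one randomly selected constraint set per step. The objective, constraint sets, step-size rule, and momentum schedule are illustrative assumptions only, not the authors' exact algorithm or numerical experiment.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative objective: f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2.
    n, d = 200, 5
    A = rng.standard_normal((n, d))
    b = A @ np.ones(d) + 0.1 * rng.standard_normal(n)

    def stochastic_grad(x):
        """Unbiased gradient estimate built from one randomly sampled row."""
        i = rng.integers(n)
        return (A[i] @ x - b[i]) * A[i]

    # Constraint set X = intersection of a Euclidean ball and a box; each
    # iteration projects onto ONE of these sets, chosen at random.
    def proj_ball(x, r=2.0):
        nrm = np.linalg.norm(x)
        return x if nrm <= r else (r / nrm) * x

    def proj_box(x, lo=-1.5, hi=1.5):
        return np.clip(x, lo, hi)

    projections = [proj_ball, proj_box]

    x_prev = np.zeros(d)
    x = np.zeros(d)
    for k in range(1, 5001):
        alpha = 1.0 / k        # diminishing step size (assumed schedule)
        beta = 0.9 / (k + 1)   # vanishing heavy-ball momentum weight (assumed)
        g = stochastic_grad(x)
        # Heavy-ball step, then projection onto a randomly chosen constraint set.
        y = x - alpha * g + beta * (x - x_prev)
        proj = projections[rng.integers(len(projections))]
        x_prev, x = x, proj(y)

    print("approximate solution:", x)

With a diminishing step size and vanishing momentum, the stochastic gradient noise is averaged out while the random projections drive the iterates toward the intersection of the constraint sets; the precise step-size, momentum, and sampling conditions under which almost sure convergence holds are those stated in the paper.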



Funding

This study is supported by the Thailand Research Fund through the Royal Golden Jubilee Ph.D. Program (Grant No. PHD/0023/2555).

Author information

Corresponding author

Correspondence to Narin Petrot.



About this article

Cite this article

Promsinchai, P., Farajzadeh, A. & Petrot, N. Stochastic Heavy-Ball Method for Constrained Stochastic Optimization Problems. Acta Math Vietnam 45, 501–514 (2020). https://doi.org/10.1007/s40306-019-00357-y

Keywords

  • Constrained stochastic optimization problem
  • Heavy-ball method
  • Random projection method
  • Almost sure convergence

Mathematics Subject Classification (2010)

  • 90C15
  • 90C25
  • 90C06
  • 65K05