Convergence rates of subgradient methods for quasi-convex optimization problems

Abstract

Quasi-convex optimization plays a pivotal role in many fields, including economics and finance, and the subgradient method is an effective iterative algorithm for solving large-scale quasi-convex optimization problems. In this paper, we investigate the quantitative convergence theory, including the iteration complexity and convergence rates, of various subgradient methods for solving quasi-convex optimization problems in a unified framework. In particular, we consider a sequence satisfying a general (inexact) basic inequality, and establish a global convergence theorem and the iteration complexity under the constant, diminishing, and dynamic stepsize rules. More importantly, we establish linear (or sublinear) convergence rates of the sequence under an additional assumption of weak sharp minima of Hölderian order and upper-bounded noise. These convergence theorems are then applied to establish the iteration complexity and convergence rates of several subgradient methods, including the standard, inexact, and conditional subgradient methods, for solving quasi-convex optimization problems under the Hölder condition and/or weak sharp minima of Hölderian order.
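
The subgradient iteration referred to in the abstract is not reproduced on this page, so the following Python sketch illustrates the basic scheme: a projected step along a normalized (quasi-)subgradient direction under the constant and diminishing stepsize rules. The names quasi_subgradient_method, subgrad, and project are illustrative placeholders, not the authors' code, and the scheme shown is the standard variant rather than the paper's inexact or conditional ones. The weak sharp minima condition of Hölderian order mentioned above is commonly written as f(x) - f* >= eta * dist(x, X*)^q for all feasible x; the example below satisfies it with eta = 1 and q = 1/2.

```python
import numpy as np

def quasi_subgradient_method(f, subgrad, project, x0, n_iters=1000,
                             alpha=0.01, stepsize="diminishing"):
    """Projected subgradient method for quasi-convex minimization (sketch).

    subgrad(x) must return a quasi-subgradient direction of f at x;
    project(y) is the Euclidean projection onto the constraint set.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(n_iters):
        g = subgrad(x)
        gn = np.linalg.norm(g)
        if gn == 0.0:                        # no usable direction: x minimizes f
            break
        g = g / gn                           # unit subgradient direction
        # constant rule: v_k = alpha;  diminishing rule: v_k -> 0, sum v_k = inf
        v = alpha if stepsize == "constant" else alpha / np.sqrt(k + 1)
        x = project(x - v * g)
        if f(x) < best_f:                    # f(x_k) need not decrease monotonically
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f

# Example: f(x) = sqrt(||x - c||) is quasi-convex (a nondecreasing transform of
# a convex function) and satisfies f(x) - f* = dist(x, X*)^(1/2).
c = np.array([3.0, -1.0])
f = lambda x: np.sqrt(np.linalg.norm(x - c))
subgrad = lambda x: x - c                     # quasi-subgradient direction at x != c
project = lambda y: np.clip(y, -10.0, 10.0)   # projection onto the box [-10, 10]^2
x_best, f_best = quasi_subgradient_method(f, subgrad, project, np.array([8.0, 8.0]))
```

With the constant rule such sequences typically approach only a neighborhood of the solution set whose radius depends on the stepsize, while the diminishing rule drives the best recorded value to the optimum; the rates at which this happens under the Hölder condition and Hölderian weak sharp minima are what the paper quantifies.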

Acknowledgements

Y. Hu was supported in part by the National Natural Science Foundation of China (11871347), Natural Science Foundation of Guangdong (2019A1515011917), Natural Science Foundation of Shenzhen (JCYJ20190808173603590, JCYJ20170817100950436, JCYJ20170818091621856) and Interdisciplinary Innovation Team of Shenzhen University. C. K. W. Yu was supported in part by grants from the Research Grants Council of the Hong Kong Special Administrative Region, China (UGC/FDS14/P02/17).

Author information

Corresponding author

Correspondence to Carisa Kwok Wai Yu.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Hu, Y., Li, J. & Yu, C.K.W. Convergence rates of subgradient methods for quasi-convex optimization problems. Comput Optim Appl 77, 183–212 (2020). https://doi.org/10.1007/s10589-020-00194-y
