
A sublevel moment-SOS hierarchy for polynomial optimization

Published in Computational Optimization and Applications.

Abstract

We introduce a sublevel Moment-SOS hierarchy in which each SDP relaxation can be viewed as an intermediate (or interpolation) between the \(d\)-th and \((d+1)\)-th order SDP relaxations of the Moment-SOS hierarchy (dense or sparse version). By flexibly choosing the size (level) and number (depth) of the subsets used in the SDP relaxation, one can obtain various improvements over the \(d\)-th order relaxation, depending on the available machine memory. In particular, we provide numerical experiments for \(d=1\) and various types of problems, both in combinatorial optimization (Max-Cut, mixed-integer programming) and deep learning (robustness certification, Lipschitz constants of neural networks), where the standard Lasserre relaxation (or its sparse variant) is computationally intractable. In our numerical results, the lower bounds from the sublevel relaxations improve on the bound from Shor's relaxation (the first-order Lasserre relaxation) and are significantly closer to the optimal value or to the best-known lower/upper bounds.
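The trade-off the abstract describes can be made concrete by counting moment-matrix sizes. The sketch below assumes the sublevel relaxation augments the global order-\(d\) relaxation with order-\((d+1)\) moment matrices restricted to chosen variable subsets of size `level`, with `depth` such subsets; the helper names and the naive first-\(k\) subset selection are illustrative assumptions, not the paper's actual construction.

```python
from itertools import combinations
from math import comb

def moment_matrix_size(n_vars: int, order: int) -> int:
    # A moment matrix of the given order is indexed by monomials of
    # degree <= order in n_vars variables.
    return comb(n_vars + order, order)

def sublevel_blocks(n: int, level: int, depth: int):
    # Pick `depth` variable subsets of size `level`; each contributes an
    # order-(d+1) moment matrix on top of the global order-d relaxation.
    # NOTE: taking the first `depth` subsets is a placeholder selection
    # strategy, chosen only so this sketch is self-contained.
    return list(combinations(range(n), level))[:depth]

n, d, level, depth = 10, 1, 3, 4
blocks  = sublevel_blocks(n, level, depth)
full_d  = moment_matrix_size(n, d)          # global order-d matrix: 11 x 11
full_d1 = moment_matrix_size(n, d + 1)      # full order-(d+1) matrix: 66 x 66
sub     = moment_matrix_size(level, d + 1)  # each sublevel block: 10 x 10
print(full_d, full_d1, sub, len(blocks))
```

The point of the construction shows up in the sizes: the full order-\((d+1)\) matrix is 66 x 66 here, while the sublevel relaxation only adds a few 10 x 10 blocks to the 11 x 11 order-\(d\) matrix, so memory grows with `depth` and `level` rather than jumping to the next full order.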


Notes

  1. http://web.stanford.edu/~yyye/yyye/Gset/

Acknowledgements

This work has benefited from the Tremplin ERC Stg Grant ANR-18-ERC2-0004-01 (T-COPS project), the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie Actions, grant agreement 813211 (POEMA), as well as from the AI Interdisciplinary Institute ANITI funding, through the French "Investing for the Future PIA3" program under the Grant agreement \(\hbox {n}^{\circ }\)ANR-19-PI3A-0004. The third author was supported by the FMJH Program PGMO (EPICS project) and by EDF, Thales, Orange, and Criteo. The fourth author acknowledges the support of the Air Force Office of Scientific Research, Air Force Materiel Command, USAF, under grant numbers FA9550-19-1-7026 and FA9550-18-1-0226, and of ANR MasDol.

Author information

Correspondence to Tong Chen.



Cite this article

Chen, T., Lasserre, JB., Magron, V. et al. A sublevel moment-SOS hierarchy for polynomial optimization. Comput Optim Appl 81, 31–66 (2022). https://doi.org/10.1007/s10589-021-00325-z
