Distributed primal outer approximation algorithm for sparse convex programming with separable structures

Journal of Global Optimization

Abstract

This paper presents the distributed primal outer approximation (DiPOA) algorithm for solving sparse convex programming (SCP) problems with separable structures efficiently and in a decentralized manner. DiPOA is developed by embedding the recently proposed relaxed hybrid alternating direction method of multipliers (RH-ADMM) algorithm into the outer approximation (OA) algorithm, so that RH-ADMM acts as a distributed numerical engine inside DiPOA. We also propose two main improvements that control the quality and the number of the cutting planes approximating the nonlinear functions. DiPOA exploits the multi-core architecture of modern processors to speed up the optimization and, from an application standpoint, makes the solution of SCP problems arising in learning and control practical. The paper concludes with a performance analysis of DiPOA on distributed sparse logistic regression and quadratically constrained optimization problems, along with a numerical comparison against state-of-the-art optimization solvers.
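To make the OA/RH-ADMM interplay concrete, the sketch below mimics the DiPOA structure on a toy distributed sparse least-squares problem: a MILP master accumulates outer-approximation cuts and selects the support through binary variables, while a distributed solver handles the convex subproblem on the chosen support. This is only an illustration under stated assumptions, not the authors' implementation (see the repository in footnote 4): plain consensus ADMM stands in for RH-ADMM, scipy.optimize.milp stands in for the MIP solver, and the problem data, big-M constant, and tolerances are made up for the example.

import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

rng = np.random.default_rng(0)
N, m, n, k, M = 4, 30, 8, 3, 10.0                 # nodes, rows per node, features, sparsity, big-M
x_true = np.zeros(n)
x_true[:3] = [2.0, -1.0, 0.5]                     # sparse ground truth
A = [rng.standard_normal((m, n)) for _ in range(N)]
b = [Ai @ x_true + 0.01 * rng.standard_normal(m) for Ai in A]

def F(x):                                         # global objective: sum of local least squares
    return 0.5 * sum(np.sum((Ai @ x - bi) ** 2) for Ai, bi in zip(A, b))

def gradF(x):
    return sum(Ai.T @ (Ai @ x - bi) for Ai, bi in zip(A, b))

def admm_nlp(z, rho=1.0, iters=200):
    # Consensus ADMM across the N nodes, restricted to the support selected by z.
    S = np.flatnonzero(z > 0.5)
    if S.size == 0:
        return np.zeros(n)
    xbar = np.zeros(S.size)
    xs = np.zeros((N, S.size))
    us = np.zeros((N, S.size))
    for _ in range(iters):
        for i in range(N):                        # local (per-node) proximal solves
            Ai = A[i][:, S]
            xs[i] = np.linalg.solve(Ai.T @ Ai + rho * np.eye(S.size),
                                    Ai.T @ b[i] + rho * (xbar - us[i]))
        xbar = (xs + us).mean(axis=0)             # averaging (consensus) step
        us += xs - xbar                           # dual updates
    x = np.zeros(n)
    x[S] = xbar
    return x

# MILP master over variables [x (n), z (n), eta]; OA cuts accumulate in the loop.
card = LinearConstraint(np.r_[np.zeros(n), np.ones(n), 0.0][None, :], -np.inf, k)
bigM = LinearConstraint(
    np.block([[np.eye(n), -M * np.eye(n), np.zeros((n, 1))],
              [np.eye(n),  M * np.eye(n), np.zeros((n, 1))]]),
    np.r_[-np.inf * np.ones(n), np.zeros(n)],
    np.r_[np.zeros(n), np.inf * np.ones(n)])
bounds = Bounds(np.r_[-M * np.ones(n), np.zeros(n), 0.0],
                np.r_[M * np.ones(n), np.ones(n), np.inf])
integrality = np.r_[np.zeros(n), np.ones(n), 0.0].astype(int)
c = np.r_[np.zeros(2 * n), 1.0]                   # master minimizes eta

cuts_A, cuts_lb = [], []
UB, x_best = np.inf, None
for it in range(20):
    cons = [card, bigM]
    if cuts_A:
        cons.append(LinearConstraint(np.array(cuts_A), np.array(cuts_lb), np.inf))
    res = milp(c=c, constraints=cons, integrality=integrality, bounds=bounds)
    LB = res.fun                                  # master optimum is a valid lower bound
    z_t = np.round(res.x[n:2 * n])
    x_t = admm_nlp(z_t)                           # distributed NLP solve on that support
    if F(x_t) < UB:
        UB, x_best = F(x_t), x_t
    g = gradF(x_t)                                # first-order (cutting-plane) information
    cuts_A.append(np.r_[-g, np.zeros(n), 1.0])    # cut: eta >= F(x_t) + g^T (x - x_t)
    cuts_lb.append(F(x_t) - g @ x_t)
    print(f"iter {it}: LB = {LB:.4f}, UB = {UB:.4f}")
    if UB - LB <= 1e-4 * max(1.0, abs(UB)):
        break

Each OA iteration adds one first-order cut of the smooth objective at the subproblem solution, so the master lower bound and the subproblem upper bound close the gap; the paper's cut-management improvements (controlling the quality and number of cutting planes) are not reproduced in this sketch.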


Notes

  1. See problem (2a)–(2e).

  2. A polytope is a bounded polyhedron [16, Chapter 2, pg. 31].

  3. The problems are different in the sense that the problem data are not necessarily the same.

  4. https://github.com/Alirezalm/dccp.git.

Abbreviations

CN:

Communication network

DiPOA:

Distributed primal outer approximation

D-MILP:

Distributed mixed integer linear program

D-NLP:

Distributed nonlinear program

D-MINLP:

Distributed MINLP

DSLR:

Distributed sparse logistic regression

ET-SoCut:

Event-triggered SoCut

GBD:

Generalized Benders decomposition

LFC:

Local fusion center

MIP:

Mixed integer programming

MPI:

Message passing interface

MINLP:

Mixed integer nonlinear program

MILP:

Mixed integer linear program

MIQP:

Mixed integer quadratic program

MIQCP:

Mixed integer quadratically constrained program

NLP:

Nonlinear programming

OA:

Outer approximation

RH-ADMM:

Relaxed-hybrid alternating direction method of multipliers

SCP:

Sparse convex programming

SLR:

Sparse logistic regression

SoCut:

Second order cut

SQCQP:

Sparse quadratically constrained quadratic programming

References

  1. Aguilera, R.P., Urrutia, G., Delgado, R.A., Dolz, D., Agüero, J.C.: Quadratic model predictive control including input cardinality constraints. IEEE Trans. Autom. Control 62(6), 3068–3075 (2017). https://doi.org/10.1109/TAC.2017.2674185

  2. Aytug, H.: Feature selection for support vector machines using generalized benders decomposition. Eur. J. Oper. Res. 244(1), 210–218 (2015). https://doi.org/10.1016/j.ejor.2015.01.006

  3. Bai, Y., Liang, R., Yang, Z.: Splitting augmented Lagrangian method for optimization problems with a cardinality constraint and semicontinuous variables. Optim. Methods Softw. 31(5), 1089–1109 (2016). https://doi.org/10.1080/10556788.2016.1196206

  4. Bernal, D.E., Vigerske, S., Trespalacios, F., Grossmann, I.E.: Improving the performance of DICOPT in convex MINLP problems using a feasibility pump. Optim. Methods Softw. 35(1), 171–190 (2020). https://doi.org/10.1080/10556788.2019.1641498

  5. Bertsimas, D., Cory-Wright, R.: A scalable algorithm for sparse portfolio selection. INFORMS J. Comput. (2022). https://doi.org/10.1287/ijoc.2021.1127

  6. Bertsimas, D., Dunn, J.: Machine Learning Under a Modern Optimization Lens. Dynamic Ideas LLC (2019)

  7. Bertsimas, D., King, A.: Best subset selection via a modern optimization lens. Ann. Stat. 44(2), 813–852 (2016). https://doi.org/10.1214/15-AOS1388

  8. Bertsimas, D., King, A.: Logistic regression: From art to science. Stat. Sci. 32(3), 367–384 (2017). https://doi.org/10.1214/16-STS602

  9. Bertsimas, D., Mundru, N.: Sparse convex regression. INFORMS J. Comput. 33(1), 262–279 (2021). https://doi.org/10.1287/ijoc.2020.0954

  10. Bertsimas, D., Pauphilet, J., Parys, B.V.: Sparse classification: a scalable discrete optimization perspective. Mach. Learn. 110(October) (2021). https://doi.org/10.1007/s10994-021-06085-5

  11. Bertsimas, D., Shioda, R.: Algorithm for cardinality-constrained quadratic optimization. Comput. Optim. Appl. 43(1), 1–22 (2009). https://doi.org/10.1007/s10589-007-9126-9

  12. Bienstock, D.: Computational study of a family of mixed-integer quadratic programming problems. Math. Program. 74(2), 121–140 (1996). https://doi.org/10.1007/BF02592208

  13. Blumensath, T.: Compressed sensing with nonlinear observations and related nonlinear optimization problems. IEEE Trans. Inf. Theory 59(6), 3466–3474 (2013). https://doi.org/10.1109/TIT.2013.2245716

  14. Blumensath, T., Davies, M.E.: Iterative hard thresholding for compressed sensing. Appl. Comput. Harmon. Anal. 27(3), 265–274 (2009). https://doi.org/10.1016/j.acha.2009.04.002

  15. Bourguignon, S., Ninin, J., Carfantan, H., Mongeau, M.: Exact sparse approximation problems via mixed-integer programming: Formulations and computational performance. IEEE Trans. Signal Process. 64(6), 1405–1419 (2015). https://doi.org/10.1109/TSP.2015.2496367

  16. Boyd, S., Vandenberghe, L.: Convex Optimization. Cambridge University Press, Cambridge (2004)

  17. Boyd, S., Parikh, N., Chu, E., Peleato, B., Eckstein, J., et al.: Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends® in Machine Learning 3(1), 1–122 (2011). https://doi.org/10.1561/2200000016

  18. Candès, E.J.: The restricted isometry property and its implications for compressed sensing. C.R. Math. 346(9–10), 589–592 (2008). https://doi.org/10.1016/j.crma.2008.03.014

  19. Duran, M.A., Grossmann, I.E.: An outer-approximation algorithm for a class of mixed-integer nonlinear programs. Math. Program. 36(3), 307–339 (1986). https://doi.org/10.1007/BF02592064

  20. Fletcher, R., Leyffer, S.: Solving mixed integer nonlinear programs by outer approximation. Math. Program. 66, 327–349 (1994). https://doi.org/10.1007/BF01581153

  21. Geoffrion, A.M.: Generalized benders decomposition. J. Optim. Theory Appl. 10(4), 237–260 (1972). https://doi.org/10.1007/BF00934810

  22. Gropp, W., Lusk, E., Skjellum, A.: Using MPI: Portable Parallel Programming with the Message-Passing Interface. MIT Press, Cambridge (2014)

  23. Grossmann, I.: Review of nonlinear mixed-integer and disjunctive programming techniques. Optim. Eng. 3(3), 227–252 (2002). https://doi.org/10.1023/A:1021039126272

  24. Gurobi Optimization, LLC: Gurobi Optimizer Reference Manual (2020). http://www.gurobi.com

  25. Hastie, T., Tibshirani, R., Tibshirani, R.: Best subset, forward stepwise or lasso? Analysis and recommendations based on extensive comparisons. Stat. Sci. 35(4), 579–592 (2020). https://doi.org/10.1214/19-STS733

  26. Hijazi, H., Bonami, P., Cornuéjols, G., Ouorou, A.: Mixed-integer nonlinear programs featuring on/off constraints. Comput. Optim. Appl. 52(2), 537–558 (2012). https://doi.org/10.1007/s10589-011-9424-0

  27. James, G., Witten, D., Hastie, T., Tibshirani, R.: An Introduction to Statistical Learning. Springer, Berlin (2013)

  28. Kamiya, S., Miyashiro, R., Takano, Y.: Feature subset selection for the multinomial logit model via mixed-integer optimization. In: Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics, pp. 1254–1263. PMLR (2019)

  29. Kronqvist, J., Bernal, D.E., Grossmann, I.E.: Using regularization and second order information in outer approximation for convex MINLP. Math. Program. 180(1–2), 285–310 (2020). https://doi.org/10.1007/s10107-018-1356-3

  30. Kronqvist, J., Bernal, D.E., Lundell, A., Grossmann, I.E.: A review and comparison of solvers for convex MINLP. Optim. Eng. 20(2), 397–455 (2019). https://doi.org/10.1007/s11081-018-9411-8

  31. Kronqvist, J., Lundell, A., Westerlund, T.: The extended supporting hyperplane algorithm for convex mixed-integer nonlinear programming. J. Global Optim. 64(2), 249–272 (2016). https://doi.org/10.1007/s10898-015-0322-3

  32. Kronqvist, J., Lundell, A., Westerlund, T.: Reformulations for utilizing separability when solving convex MINLP problems. J. Global Optim. 71(3), 571–592 (2018). https://doi.org/10.1007/s10898-018-0616-3

  33. Lundell, A., Kronqvist, J.: Integration of polyhedral outer approximation algorithms with MIP solvers through callbacks and lazy constraints. In: AIP Conference Proceedings 2070(March) (2019). https://doi.org/10.1063/1.5089979

  34. Ma, M., Nikolakopoulos, A.N., Giannakis, G.B.: Hybrid ADMM: a unifying and fast approach to decentralized optimization. EURASIP J. Adv. Signal Process. 2018(1), 1–17 (2018). https://doi.org/10.1186/s13634-018-0589-x

  35. Murray, A., Faulwasser, T., Hagenmeyer, V., Villanueva, M.E., Houska, B.: Partially distributed outer approximation. J. Global Optim. 80, 523–550 (2021). https://doi.org/10.1007/s10898-021-01015-0

  36. Muts, P., Nowak, I., Hendrix, E.M.: The decomposition-based outer approximation algorithm for convex mixed-integer nonlinear programming. J. Global Optim. 77, 75–96 (2020). https://doi.org/10.1007/s10898-020-00888-x

  37. Nedić, A., Liu, J.: Distributed optimization for control. Ann. Rev. Control Robot. Autonom. Syst. 1, 77–103 (2018). https://doi.org/10.1146/annurev-control-060117-105131

  38. Nedić, A., Olshevsky, A.: Distributed optimization over time-varying directed graphs. IEEE Trans. Autom. Control 60(3), 601–615 (2015). https://doi.org/10.1109/TAC.2014.2364096

  39. Notarstefano, G., Notarnicola, I., Camisa, A., et al.: Distributed optimization for smart cyber-physical networks. Found. Trends® Syst. Control 7(3), 253–383 (2019). https://doi.org/10.1561/2600000020

  40. Olama, A., Bastianello, N., Mendes, P.R., Camponogara, E.: Relaxed hybrid consensus ADMM for distributed convex optimisation with coupling constraints. IET Control Theory Appl. 13(17), 2828–2837 (2019). https://doi.org/10.1049/iet-cta.2018.6260

  41. Ryu, E.K., Boyd, S.: Primer on monotone operator methods. Appl. Comput. Math 15(1), 3–43 (2016)

  42. Sant’Anna, L.R., de Oliveira, A.D., Filomena, T.P., Caldeira, J.F.: Solving the index tracking problem based on a convex reformulation for cointegration. Financ. Res. Lett. 37, 101356 (2020). https://doi.org/10.1016/j.frl.2019.101356

  43. Su, L., Tang, L., Bernal, D.E., Grossmann, I.E.: Improved quadratic cuts for convex mixed-integer nonlinear programs. Comput. Chem. Eng. 109, 77–95 (2018). https://doi.org/10.1016/j.compchemeng.2017.10.011

  44. Su, L., Tang, L., Grossmann, I.E.: Computational strategies for improved MINLP algorithms. Comput. Chem. Eng. 75, 40–48 (2015). https://doi.org/10.1016/j.compchemeng.2015.01.015

  45. Sun, X., Zheng, X., Li, D.: Recent advances in mathematical programming with semi-continuous variables and cardinality constraint. J. Oper. Res. Soc. China 1(1), 55–77 (2013). https://doi.org/10.1007/s40305-013-0004-0

  46. Tibshirani, R.: Regression shrinkage and selection via the lasso. J. R. Stat. Soc.: Ser. B (Methodol.) 58(1), 267–288 (1996). https://doi.org/10.1111/j.2517-6161.1996.tb02080.x

  47. Tillmann, A.M., Bienstock, D., Lodi, A., Schwartz, A.: Cardinality minimization, constraints, and regularization: a survey. arXiv preprint arXiv:2106.09606 (2021)

  48. Wang, F., Cao, L.: A new algorithm for quadratic integer programming problems with cardinality constraint. Jpn. J. Ind. Appl. Math. 37, 449–460 (2020). https://doi.org/10.1007/s13160-019-00403-0

  49. Westerlund, T., Pettersson, F.: An extended cutting plane method for solving convex MINLP problems. Comput. Chem. Eng. 19, 131–136 (1995). https://doi.org/10.1016/0098-1354(95)87027-X

Funding

This work was funded in part by Fundação de Amparo à Pesquisa e Inovação do Estado de Santa Catarina (FAPESC) under Grant 2021TR2265 and Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES, Brazil) under the project PrInt CAPES-UFSC 698503P1.

Author information

Corresponding author

Correspondence to Alireza Olama.

Ethics declarations

Competing interests

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Olama, A., Camponogara, E. & Mendes, P.R.C. Distributed primal outer approximation algorithm for sparse convex programming with separable structures. J Glob Optim 86, 637–670 (2023). https://doi.org/10.1007/s10898-022-01266-5
