Abstract
In this paper, we describe a new active-set algorithmic framework for minimizing a non-convex function over the unit simplex. At each iteration, the method makes use of a rule for identifying active variables (i.e., variables that are zero at a stationary point) and of specific directions (which we call active-set gradient-related directions) that satisfy a new "nonorthogonality"-type condition. We prove global convergence to stationary points when an Armijo line search is used within the given framework. We further describe three different examples of active-set gradient-related directions that guarantee a linear convergence rate (under suitable assumptions). Finally, we report numerical experiments showing the effectiveness of the approach.
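To fix ideas, the basic ingredients mentioned in the abstract (feasible descent directions over the unit simplex combined with an Armijo line search) can be illustrated by a minimal projected-gradient sketch. This is not the paper's active-set method: the direction here is the plain projected-gradient direction, the simplex projection uses the standard sorting-based formula, and all parameter values (`gamma`, `delta`, `tol`) are illustrative assumptions.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the unit simplex {x >= 0, sum(x) = 1},
    via the standard sorting-based threshold computation."""
    u = np.sort(v)[::-1]                       # sort in decreasing order
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u + (1.0 - css) / idx > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)       # shift so the positive part sums to 1
    return np.maximum(v + theta, 0.0)

def projected_gradient_armijo(f, grad, x0, gamma=1e-4, delta=0.5,
                              max_iter=500, tol=1e-8):
    """Projected gradient over the unit simplex with Armijo backtracking
    along the feasible direction d_k = P(x_k - grad f(x_k)) - x_k."""
    x = x0.copy()
    for _ in range(max_iter):
        g = grad(x)
        d = project_simplex(x - g) - x
        if np.linalg.norm(d) < tol:            # ||d|| is a stationarity measure
            break
        alpha = 1.0
        # Armijo condition: sufficient decrease along the feasible direction
        while f(x + alpha * d) > f(x) + gamma * alpha * g.dot(d):
            alpha *= delta
        x = x + alpha * d
    return x
```

For example, minimizing the convex quadratic `0.5 * ||x - c||^2` over the simplex from the uniform starting point recovers the projection of `c` onto the simplex.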
Cite this article
Cristofari, A., De Santis, M., Lucidi, S. et al. An active-set algorithmic framework for non-convex optimization problems over the simplex. Comput Optim Appl 77, 57–89 (2020). https://doi.org/10.1007/s10589-020-00195-x