Abstract
In this paper, the problem of computing the projection of a point onto a Minkowski sum of general convex sets, and hence the minimum distance from the point to the sum, is studied. Our approach is based on Nirenberg’s minimum norm duality theorem and Nesterov’s smoothing techniques. It is shown that the projection onto a Minkowski sum of sets can be represented as a sum of points, one on each constituent set, at which all of the sets share the same normal vector, namely the negative of the dual solution. For solving the problem numerically, the most suitable existing algorithm is the one suggested by Gilbert (SIAM J Control 4:61–80, 1966); it has been widely used for collision detection and path planning in robotics. A main drawback of this method, however, is that in some cases it becomes very slow as it approaches the solution. We propose NESMINO, whose \(O\left( \frac{1}{\sqrt{\epsilon }}\ln \left(\frac{1}{\epsilon }\right)\right) \) complexity bound is better than the worst-case complexity bound \(O\left(\frac{1}{\epsilon }\right)\) of Gilbert’s algorithm.
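To make the comparison with Gilbert’s algorithm concrete, the sketch below is a minimal illustration of ours, not code from the paper, of Gilbert’s algorithm applied to a Minkowski sum of polytopes. The representation of each set as a NumPy vertex array and the names `support_point` and `gilbert_projection` are illustrative assumptions. The sketch relies on the standard fact that a support point of a Minkowski sum in a given direction is the sum of support points of the constituent sets, so the sum itself never has to be formed explicitly.

```python
import numpy as np

def support_point(direction, vertex_sets):
    # Support point of the Minkowski sum: the sum of the constituent
    # support points. Each set is a polytope given as a
    # (num_vertices x dim) array of vertices.
    return sum(V[np.argmax(V @ direction)] for V in vertex_sets)

def gilbert_projection(vertex_sets, tol=1e-8, max_iter=10000):
    # Project the origin onto the Minkowski sum of the given polytopes.
    # To project a general point u, translate one constituent set by -u
    # and add u back to the result.
    dim = vertex_sets[0].shape[1]
    x = support_point(np.ones(dim), vertex_sets)  # any feasible start
    for _ in range(max_iter):
        s = support_point(-x, vertex_sets)  # minimizes <x, y> over the sum
        gap = x @ (x - s)                   # optimality-gap bound, >= 0
        if gap <= tol:
            break
        # Exact line search for min ||(1 - t) x + t s|| over t in [0, 1]
        d = s - x
        t = min(1.0, max(0.0, -(x @ d) / (d @ d)))
        x = x + t * d
    return x

# Example: a square and a segment, both kept away from the origin
A = np.array([[1.0, 1.0], [2.0, 1.0], [1.0, 2.0], [2.0, 2.0]])
B = np.array([[0.5, 0.0], [0.0, 0.5]])
print(gilbert_projection([A, B]))  # approx. closest point of A + B to 0
```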
References
Bauschke, H.H., Bui, M.N., Wang, X.: On sums and convex combinations of projectors onto convex sets. J. Approx. Theory 242, 31–57 (2019)
Beck, A.: First-Order Methods in Optimization, vol. 25. SIAM, Philadelphia (2017)
van den Bergen, G.: A fast and robust GJK implementation for collision detection of convex objects. Tech. Report, Department of Mathematics and Computing Science, Eindhoven University of Technology (1999)
Borwein, J.M., Lewis, A.S.: Convex Analysis and Nonlinear Optimization: Theory and Examples. CMS Books in Mathematics. Springer, New York (2000)
Cameron, S.: Enhancing GJK: computing minimum and penetration distances between convex polyhedra. In: Proceedings of the IEEE International Conference on Robotics and Automation, pp. 3112–3117 (1997)
Chang, L., Qiao, H., Wan, A., Keane, J.: An improved Gilbert algorithm with rapid convergence. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3861–3866 (2006)
Chen, Y., Ye, X.: Projection onto a simplex. arXiv:1208.4873 (2012)
Condat, L.: Fast projection onto the simplex and the \(\ell _1\) ball. Math. Program. 158, 575–585 (2016)
Dai, Y.H.: Fast algorithms for projection on an ellipsoid. SIAM J. Optim. 16, 986–1006 (2006)
Dax, A.: A new class of minimum norm duality theorems. SIAM J. Optim. 19, 1947–1969 (2009)
Defazio, A., Bach, F., Lacoste-Julien, S.: SAGA: a fast incremental gradient method with support for non-strongly convex composite objectives. In: Advances in Neural Information Processing Systems (2014)
Duchi, J., Shalev-Shwartz, S., Singer, Y., Chandra, T.: Efficient projections onto the \(\ell _1\)-ball for learning in high dimensions. In: Proceedings of the 25th ACM International Conference on Machine Learning, pp. 272–279 (2008)
Frank, M., Wolfe, P.: An algorithm for quadratic programming. Naval Res. Logist. Q. 3, 95–110 (1956)
Gabidullina, Z.R.: The problem of projecting the origin of Euclidean space onto the convex polyhedron. arXiv:1605.05351 (2016)
Gaines, B.R., Kim, J., Zhou, H.: Algorithms for fitting the constrained lasso. J. Comput. Graph. Stat. 27(4), 861–871 (2018)
Garber, D., Hazan, E.: Faster rates for the Frank–Wolfe method over strongly convex sets. In: ICML, pp. 541–549 (2015)
Gilbert, E.G.: An iterative procedure for computing the minimum of a quadratic form on a convex set. SIAM J. Control 4, 61–80 (1966)
Gilbert, E.G., Johnson, D.W., Keerthi, S.S.: A fast procedure for computing the distance between complex objects in three-dimensional space. IEEE Trans. Robot. Autom. 4, 193–203 (1988)
Gilbert, E.G., Foo, C.-P.: Computing the distance between general convex objects in three-dimensional space. IEEE Trans. Robot. Autom. 6, 53–61 (1990)
Hiriart-Urruty, J.B., Lemaréchal, C.: Convex Analysis and Minimization Algorithms, I and II, Grundlehren Math. Wiss. 305 and 306. Springer, Berlin (1993)
Jaggi, M.: Revisiting Frank–Wolfe: projection-free sparse convex optimization. In: ICML, vol. 1, pp. 427–435 (2013)
Keerthi, S.S., Shevade, S.K., Bhattacharyya, C., Murthy, K.R.K.: A fast iterative nearest point algorithm for support vector machine classifier design. IEEE Trans. Neural Netw. 11, 124–136 (2000)
Lacoste-Julien, S., Jaggi, M.: On the global linear convergence of Frank–Wolfe optimization variants. In: Advances in Neural Information Processing Systems, pp. 496–504 (2015)
Kurzhanskiy, A.A., Varaiya, P.: Ellipsoidal Toolbox, Tech. Report EECS-2006-46, EECS, UC Berkeley (2006)
Luenberger, D.G.: Optimization by Vector Space Methods. Wiley, New York (1969)
Martin, S.: Training support vector machines using Gilbert’s algorithm. In: The 5th IEEE International Conference on Data Mining (ICDM), pp. 306–313 (2005)
Mitchell, B.F., Demyanov, V.F., Malozemov, V.N.: Finding the point of a polyhedron closest to the origin. SIAM J. Control Optim. 12, 19–26 (1974)
Mordukhovich, B.S., Nam, N.M.: Limiting subgradients of minimal time functions in Banach spaces. J. Glob. Optim. 46, 615–633 (2010)
Nam, N.M., An, N.T., Rector, R.B., Sun, J.: Nonsmooth algorithms and Nesterov smoothing technique for generalized Fermat–Torricelli problems. SIAM J. Optim. 24(4), 1815–1839 (2014)
Nesterov, Y.: Smooth minimization of non-smooth functions. Math. Program. 103, 127–152 (2005)
Nesterov, Y.: Introductory Lectures on Convex Optimization: A Basic Course. Applied Optimization, vol. 87. Kluwer, Boston (2004)
Nesterov, Y.: A method for unconstrained convex minimization problem with the rate of convergence \(O\left(\dfrac{1}{k^2}\right)\). Dokl. Akad. Nauk SSSR 269, 543–547 (1983)
Nirenberg, L.: Functional Analysis. Academic Press, New York (1961)
Rockafellar, R.T.: Convex Analysis. Princeton University Press, Princeton (1970)
Schmidt, M., Le Roux, N., Bach, F.: Minimizing finite sums with the stochastic average gradient. Technical report, INRIA, hal-0086005 (2013)
Tibshirani, R.: Regression shrinkage and selection via the lasso. J. R. Stat. Soc.: Ser. B (Methodological) 58(1), 267–288 (1996)
Tibshirani, R., Taylor, J.: The solution path of the generalized lasso. Ann. Stat. 39(3), 1335–1371 (2011)
Tuy, H.: Convex Analysis and Global Optimization. Nonconvex Optimization and Its Applications. Kluwer, Dordrecht (1998)
Wolfe, P.: Finding the nearest point in a polytope. Math. Program. 11, 128–149 (1976)
Won, J.H., Xu, J., Lange, K.: Projection onto Minkowski sums with application to constrained learning. In: International Conference on Machine Learning, pp. 3642–3651 (2019)
Yuan, M., Lin, Y.: Model selection and estimation in regression with grouped variables. J. R. Stat. Soc.: Ser. B (Statistical Methodology) 68(1), 49–67 (2006)
Yuan, L., Liu, J., Ye, J.: Efficient methods for overlapping group lasso. In: Advances in Neural Information Processing Systems, pp. 352–360 (2011)
Acknowledgements
This article was supported by the National Natural Science Foundation of China under Grant Nos. 11401152 and 11950410503. Research of the second author was supported by the China Postdoctoral Science Foundation under Grant No. 2017M622991 and the Vietnam National Foundation for Science and Technology Development under Grant No. 101.01-2017.325. The authors would like to thank the anonymous reviewers for their insightful comments, which helped to greatly improve the manuscript.
Cite this article
Qin, X., An, N.T. Smoothing algorithms for computing the projection onto a Minkowski sum of convex sets. Comput Optim Appl 74, 821–850 (2019). https://doi.org/10.1007/s10589-019-00124-7
Keywords
- Minimum norm problem
- Minkowski sum of sets
- Gilbert’s algorithm
- Nesterov’s smoothing technique
- Fast gradient method
- SAGA