Smoothing algorithms for computing the projection onto a Minkowski sum of convex sets

Computational Optimization and Applications

Abstract

In this paper, the problem of computing the projection, and therefore the minimum distance, from a point onto a Minkowski sum of general convex sets is studied. Our approach is based on Nirenberg’s minimum norm duality theorem and Nesterov’s smoothing techniques. It is shown that the projection onto a Minkowski sum of sets can be represented as the sum of points on the constituent sets such that, at these points, all of the sets share the same normal vector, which is the negative of the dual solution. For solving the problem numerically, the most suitable algorithm is the one suggested by Gilbert (SIAM J Control 4:61–80, 1966). This algorithm has been widely used in collision detection and path planning in robotics. However, a main drawback of this method is that, in some cases, it becomes very slow as it approaches the solution. In this paper we propose NESMINO, whose \(O\left( \frac{1}{\sqrt{\epsilon }}\ln (\frac{1}{\epsilon })\right) \) complexity bound improves on the worst-case complexity bound of \(O(\frac{1}{\epsilon })\) of Gilbert’s algorithm.
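
To make the setting concrete, the following minimal sketch (not the authors' NESMINO method, nor their implementation of Gilbert's algorithm) illustrates a Gilbert-type iteration for the minimum-norm point over a convex set given by a support-point oracle. Projecting a point u onto a Minkowski sum Q1 + Q2 reduces to this minimum-norm problem over (Q1 + Q2) - u, using the fact that the support point of a Minkowski sum in a given direction is the sum of the constituent sets' support points. The example sets (two Euclidean balls), the helper names, the tolerance, and the starting point are illustrative assumptions, not taken from the paper.

import numpy as np

def support_ball(center, radius):
    """Support-point oracle of a Euclidean ball: argmax over the ball of <d, x>."""
    def s(d):
        n = np.linalg.norm(d)
        return center if n == 0 else center + radius * d / n
    return s

def support_sum(oracles):
    """Support point of a Minkowski sum = sum of the constituent support points."""
    return lambda d: sum(s(d) for s in oracles)

def gilbert_min_norm(support, z0, tol=1e-8, max_iter=10000):
    """Gilbert-type iteration (Frank-Wolfe with exact line search) for
    minimizing ||z|| over a convex set given by its support-point oracle."""
    z = np.asarray(z0, dtype=float)
    for _ in range(max_iter):
        s = support(-z)                              # best point of the set in direction -z
        if z @ (z - s) <= tol:                       # Frank-Wolfe gap used as stopping test
            break
        d = s - z
        t = min(1.0, max(0.0, -(z @ d) / (d @ d)))   # exact line search on the segment [z, s]
        z = z + t * d
    return z

# Project u onto Q1 + Q2 (two balls here) via the min-norm problem over (Q1 + Q2) - u.
u = np.array([5.0, 1.0])
q1 = support_ball(np.array([0.0, 0.0]), 1.0)
q2 = support_ball(np.array([1.0, 0.0]), 0.5)
shifted = lambda d: support_sum([q1, q2])(d) - u     # support oracle of the shifted sum
z_star = gilbert_min_norm(shifted, z0=shifted(np.array([1.0, 0.0])))
print("projection of u onto Q1 + Q2:", z_star + u)   # approx. [2.455, 0.364]
print("distance from u:", np.linalg.norm(z_star))    # approx. 2.623

The recovered minimizer also illustrates the structural result quoted above: the computed projection decomposes as a sum of one point from each ball, and at those points both balls share the same outward normal direction.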

References

  1. Bauschke, H.H., Bui, M.N., Wang, X.: On sums and convex combinations of projectors onto convex sets. J. Approx. Theory 242, 31–57 (2019)

  2. Beck, A.: First-Order Methods in Optimization, vol. 25. SIAM, Philadelphia (2017)

  3. Bergen, G.: A fast and robust GJK implementation for collision detection of convex objects, Tech. report, Department of Mathematics and Computing Science, Eindhoven University of Technology (1999)

  4. Borwein, J.M., Lewis, A.S.: Convex Analysis and Nonlinear Optimization: Theory and Examples. CMS Books in Mathematics, Canadian Mathematical Society (2000)

  5. Cameron, S.: Enhancing GJK: computing minimum and penetration distances between convex polyhedra. In: Proceedings of the IEEE International Conference on Robotics and Automation, pp. 3112–3117 (1997)

  6. Chang, L., Qiao, H., Wan, A., Keane, J.: An improved Gilbert algorithm with rapid convergence. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3861–3866 (2006)

  7. Chen, Y., Ye, X.: Projection onto a simplex. arXiv:1208.4873

  8. Condat, L.: Fast projection onto the simplex and the \(\ell _1\) ball. Math. Program. 158, 575–585 (2016)

  9. Dai, Y.H.: Fast algorithms for projection on an ellipsoid. SIAM J. Optim. 16, 986–1006 (2006)

  10. Dax, A.: A new class of minimum norm duality theorems. SIAM J. Optim. 19, 1947–1969 (2009)

  11. Defazio, A., Bach, F., Lacoste-Julien, S.: Saga: a fast incremental gradient method with support for non-strongly convex composite objectives. In: Advances in Neural Information Processing Systems (2014)

  12. Duchi, J., Shalev-Shwartz, S., Singer, Y., Chandra, T.: Efficient projections onto the \(\ell _1\)-ball for learning in high dimensions. In: Proceedings of the 25th ACM International Conference on Machine Learning, pp. 272–279 (2008)

  13. Frank, M., Wolfe, P.: An algorithm for quadratic programming. Naval Res. Logist. Q. 3, 95–110 (1956)

  14. Gabidullina, Z.R.: The problem of projecting the origin of Euclidean space onto the convex polyhedron. arXiv:1605.05351

  15. Gaines, B.R., Kim, J., Zhou, H.: Algorithms for fitting the constrained lasso. J. Comput. Graph. Stat. 27(4), 861–871 (2018)

  16. Garber, D., Hazan, E.: Faster rates for the Frank–Wolfe method over strongly convex sets. In: ICML, pp. 541–549 (2015)

  17. Gilbert, E.G.: An iterative procedure for computing the minimum of a quadratic form on a convex set. SIAM J. Control 4, 61–80 (1966)

  18. Gilbert, E.G., Johnson, D.W., Keerthi, S.S.: A fast procedure for computing the distance between complex objects in three-dimensional space. IEEE Trans. Robot. Autom. 4, 193–203 (1988)

  19. Gilbert, E.G., Foo, C.-P.: Computing the distance between general convex objects in three-dimensional space. IEEE Trans. Robot. Autom. 6, 53–61 (1990)

  20. Hiriart-Urruty, J.B., Lemaréchal, C.: Convex Analysis and Minimization Algorithms, I and II, Grundlehren Math. Wiss. 305 and 306. Springer, Berlin (1993)

  21. Jaggi, M.: Revisiting Frank–Wolfe: projection-free sparse convex optimization. In: ICML, vol. 1, pp. 427–435 (2013)

  22. Lacoste-Julien, S., Jaggi, M.: On the global linear convergence of Frank–Wolfe optimization variants. In: Advances in Neural Information Processing Systems, pp. 496–504 (2015)

  23. Keerthi, S.S., Shevade, S.K., Bhattacharyya, C., Murthy, K.R.K.: A fast iterative nearest point algorithm for support vector machine classifier design. IEEE Trans. Neural Netw. 11, 124–136 (2000)

  24. Kurzhanskiy, A.A., Varaiya, P.: Ellipsoidal Toolbox, Tech. Report EECS-2006-46, EECS, UC Berkeley (2006)

  25. Luenberger, D.G.: Optimization by Vector Space Methods. Wiley, New York (1969)

  26. Martin, S.: Training support vector machines using Gilbert’s algorithm. In: The 5th IEEE International Conference on Data Mining (ICDM), pp. 306–313 (2005)

  27. Mitchell, B.F., Demyanov, V.F., Malozemov, V.N.: Finding the point of a polyhedron closest to the origin. SIAM J. Control Optim. 12, 19–26 (1974)

  28. Mordukhovich, B.S., Nam, N.M.: Limiting subgradients of minimal time functions in Banach spaces. J. Glob. Optim. 46, 615–633 (2010)

  29. Nam, N.M., An, N.T., Rector, R.B., Sun, J.: Nonsmooth algorithms and Nesterov smoothing technique for generalized Fermat–Torricelli problems. SIAM J. Optim. 24(4), 1815–1839 (2014)

  30. Nesterov, Y.: Smooth minimization of non-smooth functions. Math. Program. 103, 127–152 (2005)

  31. Nesterov, Y.: Introductory Lectures on Convex Optimization: A Basic Course. Appl. Optim., vol. 87. Kluwer, Boston (2004)

  32. Nesterov, Y.: A method for unconstrained convex minimization problem with the rate of convergence \(O\left(\dfrac{1}{k^2}\right)\). Dokl. Akad. Nauk SSSR 269, 543–547 (1983)

  33. Nirenberg, L.: Functional Analysis. Academic Press, New York (1961)

  34. Rockafellar, R.T.: Convex Analysis. Princeton University Press, Princeton (1970)

  35. Schmidt, M., Le Roux, N., Bach, F.: Minimizing finite sums with the stochastic average gradient. Technical report, INRIA, hal-0086005 (2013)

  36. Tibshirani, R.: Regression shrinkage and selection via the lasso. J. R. Stat. Soc.: Ser. B (Methodological) 58(1), 267–288 (1996)

  37. Tibshirani, R., Taylor, J.: The solution path of the generalized lasso. Ann. Stat. 39(3), 1335–1371 (2011)

  38. Tuy, H.: Convex Analysis and Global Optimization: Nonconvex Optimization and Its Applications. Kluwer, Dordrecht (1998)

  39. Wolfe, P.: Finding the nearest point in a polytope. Math. Program. 11, 128–149 (1976)

  40. Won, J.H., Xu, J., Lange, K.: Projection onto Minkowski sums with application to constrained learning. In: International Conference on Machine Learning, pp. 3642–3651 (2019)

  41. Yuan, M., Lin, Y.: Model selection and estimation in regression with grouped variables. J. R. Stat. Soc.: Ser. B (Statistical Methodology) 68(1), 49–67 (2006)

  42. Yuan, L., Liu, J., Ye, J.: Efficient methods for overlapping group lasso. In: Advances in Neural Information Processing Systems, pp. 352–360 (2011)

Acknowledgements

This article was supported by the National Natural Science Foundation of China under Grant Nos. 11401152 and 11950410503. Research of the second author was supported by the China Postdoctoral Science Foundation under Grant No. 2017M622991 and the Vietnam National Foundation for Science and Technology Development under Grant No. 101.01-2017.325. The authors would like to thank the anonymous reviewers for their insightful comments, which helped to greatly improve the manuscript.

Author information

Corresponding author

Correspondence to Xiaolong Qin.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Qin, X., An, N.T. Smoothing algorithms for computing the projection onto a Minkowski sum of convex sets. Comput Optim Appl 74, 821–850 (2019). https://doi.org/10.1007/s10589-019-00124-7

Keywords

Mathematics Subject Classification
