
When do birds of a feather flock together? k-Means, proximity, and conic programming

  • Full Length Paper
  • Series A

Mathematical Programming

Abstract

Given a set of data points, a central goal is to group them into clusters based on some notion of similarity between the individual objects. One of the most popular and widely used approaches is k-means, despite the computational hardness of finding its global minimum. We study and compare the properties of different convex relaxations by relating them to corresponding proximity conditions, an idea originally introduced by Kumar and Kannan. Using conic duality theory, we present an improved proximity condition under which the Peng–Wei relaxation of k-means recovers the underlying clusters exactly. Our proximity condition improves upon that of Kumar and Kannan and is comparable to that of Awasthi and Sheffet, whose proximity conditions are established for projective k-means. In addition, we provide a necessary proximity condition for the exactness of the Peng–Wei relaxation. For the special case of equal cluster sizes, we establish a different and completely localized proximity condition under which the Amini–Levina relaxation yields exact clustering, thereby addressing an open problem posed by Awasthi and Sheffet in the balanced case. Our framework is not only deterministic and model-free but also comes with a clear geometric meaning, which allows for further analysis and generalization. Moreover, it can be conveniently applied to the analysis of various generative data models, such as stochastic ball models and Gaussian mixture models. With this method, we improve the current minimum separation bound for stochastic ball models and achieve state-of-the-art results for learning Gaussian mixture models.
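
As a concrete illustration of the Peng–Wei relaxation discussed above, the following is a minimal sketch in Python using NumPy and CVXPY; it is not the authors' implementation, and the function and variable names are ours. It solves the standard semidefinite relaxation of k-means [21]: minimize \(\langle \mathbf{D}, \mathbf{Z}\rangle\) over symmetric positive semidefinite, entrywise nonnegative matrices \(\mathbf{Z}\) with unit row sums and trace \(k\), where \(\mathbf{D}\) is the matrix of squared pairwise distances.

```python
# Minimal sketch of the Peng-Wei SDP relaxation of k-means (illustrative, not the authors' code).
# Assumes numpy and cvxpy are installed; cvxpy's default SDP solver is used.
import numpy as np
import cvxpy as cp

def peng_wei_relaxation(X, k):
    """Solve the Peng-Wei SDP relaxation for data X (n x d) and k clusters."""
    n = X.shape[0]
    # Matrix of squared pairwise Euclidean distances.
    D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)

    Z = cp.Variable((n, n), PSD=True)          # symmetric positive semidefinite
    constraints = [
        Z >= 0,                                # entrywise nonnegative
        cp.sum(Z, axis=1) == np.ones(n),       # each row sums to one
        cp.trace(Z) == k,                      # trace equals the number of clusters
    ]
    prob = cp.Problem(cp.Minimize(cp.trace(D @ Z)), constraints)
    prob.solve()
    return Z.value

# Example: two well-separated blobs. When the relaxation is exact, the solution is
# block diagonal with entries 1/|cluster| within each cluster and zeros elsewhere.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (10, 2)), rng.normal(5.0, 0.1, (10, 2))])
Z_hat = peng_wei_relaxation(X, k=2)
print(np.round(Z_hat, 2))
```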


Notes

  1. \(\mathcal{K}\) is pointed if \(\mathbf{Z}\in \mathcal{K}\) and \(-\mathbf{Z}\in \mathcal{K}\) together imply \(\mathbf{Z} = \mathbf{0}\); see Chapter 2 in [7].

  2. The dual cone of \(\mathcal{K}\) is defined as \(\mathcal{K}^* = \{\mathbf{W}: \langle \mathbf{W}, \mathbf{Z}\rangle \ge 0, \ \forall \mathbf{Z}\in \mathcal{K}\}\); in particular, \((\mathcal{K}^*)^* = \mathcal{K}\). A standard example is given after these notes.

  3. The primal or dual problem is solvable if it is feasible, bounded, and its optimal value is attained.
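
As a standard illustration of the notions in Notes 1 and 2 (an example of ours, not taken from the paper), consider the cone \(\mathbb{S}^n_+\) of \(n\times n\) symmetric positive semidefinite matrices. It is pointed: if \(\mathbf{Z}\in \mathbb{S}^n_+\) and \(-\mathbf{Z}\in \mathbb{S}^n_+\), then every eigenvalue of \(\mathbf{Z}\) is both nonnegative and nonpositive, so \(\mathbf{Z}=\mathbf{0}\). It is also self-dual with respect to the trace inner product:
\[
(\mathbb{S}^n_+)^* = \{\mathbf{W}\in\mathbb{S}^n : \operatorname{tr}(\mathbf{W}\mathbf{Z}) \ge 0,\ \forall\, \mathbf{Z}\in \mathbb{S}^n_+\} = \mathbb{S}^n_+,
\]
so \(((\mathbb{S}^n_+)^*)^* = \mathbb{S}^n_+\), consistent with Note 2.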

References

  1. Achlioptas, D., McSherry, F.: On spectral learning of mixtures of distributions. In: International Conference on Computational Learning Theory, pp. 458–469. Springer, New York (2005)

  2. Aloise, D., Deshpande, A., Hansen, P., Popat, P.: NP-hardness of Euclidean sum-of-squares clustering. Mach. Learn. 75(2), 245–248 (2009)

  3. Amini, A.A., Levina, E.: On semidefinite relaxations for the block model. Ann. Stat. 46(1), 149–179 (2018)

  4. Arthur, D., Manthey, B., Röglin, H.: Smoothed analysis of the k-means method. J. ACM 58(5), 19 (2011)

  5. Awasthi, P., Bandeira, A.S., Charikar, M., Krishnaswamy, R., Villar, S., Ward, R.: Relax, no need to round: integrality of clustering formulations. In: Proceedings of the 2015 Conference on Innovations in Theoretical Computer Science, pp. 191–200. ACM (2015)

  6. Awasthi, P., Sheffet, O.: Improved spectral-norm bounds for clustering. In: APPROX-RANDOM, pp. 37–49. Springer, New York (2012)

  7. Ben-Tal, A., Nemirovski, A.: Lectures on Modern Convex Optimization: Analysis, Algorithms, and Engineering Applications. SIAM (2001)

  8. Boyd, S., Vandenberghe, L.: Convex Optimization. Cambridge University Press, Cambridge (2004)

  9. Dasgupta, S.: Learning mixtures of Gaussians. In: 40th Annual Symposium on Foundations of Computer Science, pp. 634–644. IEEE (1999)

  10. Du, Q., Faber, V., Gunzburger, M.: Centroidal Voronoi tessellations: applications and algorithms. SIAM Rev. 41(4), 637–676 (1999)

  11. Golub, G.H., Van Loan, C.F.: Matrix Computations, 3rd edn. The Johns Hopkins University Press, Baltimore (1996)

  12. Iguchi, T., Mixon, D.G., Peterson, J., Villar, S.: On the tightness of an SDP relaxation of k-means (2015). arXiv preprint arXiv:1505.04778

  13. Iguchi, T., Mixon, D.G., Peterson, J., Villar, S.: Probably certifiably correct k-means clustering. Math. Program. 165(2), 605–642 (2017)

  14. Kannan, R., Vempala, S.: Spectral algorithms. Found. Trends Theor. Comput. Sci. 4(3–4), 157–288 (2009)

  15. Kumar, A., Kannan, R.: Clustering with spectral norm and the k-means algorithm. In: 2010 51st Annual IEEE Symposium on Foundations of Computer Science (FOCS), pp. 299–308. IEEE (2010)

  16. Ling, S., Strohmer, T.: Certifying global optimality of graph cuts via semidefinite relaxation: a performance guarantee for spectral clustering (2018). arXiv preprint arXiv:1806.11429

  17. Lloyd, S.: Least squares quantization in PCM. IEEE Trans. Inf. Theory 28(2), 129–137 (1982)

  18. Lu, Y., Zhou, H.H.: Statistical and computational guarantees of Lloyd's algorithm and its variants (2016). arXiv preprint arXiv:1612.02099

  19. Mahajan, M., Nimbhorkar, P., Varadarajan, K.: The planar k-means problem is NP-hard. In: International Workshop on Algorithms and Computation, pp. 274–285. Springer, New York (2009)

  20. Mixon, D.G., Villar, S., Ward, R.: Clustering subgaussian mixtures by semidefinite programming. Inf. Inference: J. IMA 6(4), 389–415 (2017)

  21. Peng, J., Wei, Y.: Approximating k-means-type clustering via semidefinite programming. SIAM J. Optim. 18(1), 186–205 (2007)

  22. Selim, S.Z., Ismail, M.A.: k-Means-type algorithms: a generalized convergence theorem and characterization of local optimality. IEEE Trans. Pattern Anal. Mach. Intell. 6(1), 81–87 (1984)

  23. Tropp, J.A.: User-friendly tail bounds for sums of random matrices. Found. Comput. Math. 12(4), 389–434 (2012)

  24. Vattani, A.: k-Means requires exponentially many iterations even in the plane. Discrete Comput. Geom. 45(4), 596–616 (2011)

  25. Vempala, S., Wang, G.: A spectral algorithm for learning mixture models. J. Comput. Syst. Sci. 68(4), 841–860 (2004)

  26. Vershynin, R.: Introduction to the non-asymptotic analysis of random matrices. In: Eldar, Y.C., Kutyniok, G. (eds.) Compressed Sensing: Theory and Applications, Chapter 5. Cambridge University Press, Cambridge (2012)

  27. Wright, S.J.: Primal-Dual Interior-Point Methods. SIAM (1997)

  28. Yang, L., Sun, D., Toh, K.-C.: SDPNAL+: a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints. Math. Program. Comput. 7(3), 331–366 (2015)

  29. Zhao, X.-Y., Sun, D., Toh, K.-C.: A Newton-CG augmented Lagrangian method for semidefinite programming. SIAM J. Optim. 20(4), 1737–1765 (2010)


Author information


Corresponding author

Correspondence to Yang Li.

Additional information

Y. Li, S. Ling, T. Strohmer, and K. Wei acknowledge support from the NSF via Grants DMS 1620455 and DMS 1737943.

About this article

Cite this article

Li, X., Li, Y., Ling, S. et al. When do birds of a feather flock together? k-Means, proximity, and conic programming. Math. Program. 179, 295–341 (2020). https://doi.org/10.1007/s10107-018-1333-x

  • DOI: https://doi.org/10.1007/s10107-018-1333-x
