Abstract
Identifying clusters of similar objects in data plays a significant role in a wide range of applications. As a model problem for clustering, we consider the densest \(k\)-disjoint-clique problem, whose goal is to identify the collection of \(k\) disjoint cliques of a given weighted complete graph maximizing the sum of the densities of the complete subgraphs induced by these cliques. In this paper, we establish conditions ensuring exact recovery of the densest \(k\) cliques of a given graph from the optimal solution of a particular semidefinite program. In particular, the semidefinite relaxation is exact for input graphs corresponding to data consisting of \(k\) large, distinct clusters and a smaller number of outliers. This approach also yields a semidefinite relaxation with similar recovery guarantees for the biclustering problem. Given a set of objects and a set of features exhibited by these objects, biclustering seeks to simultaneously group the objects and features according to their expression levels. This problem may be posed as that of partitioning the nodes of a weighted bipartite complete graph such that the sum of the densities of the resulting bipartite complete subgraphs is maximized. As in our analysis of the densest \(k\)-disjoint-clique problem, we show that the correct partition of the objects and features can be recovered from the optimal solution of a semidefinite program in the case that the given data consists of several disjoint sets of objects exhibiting similar features. Empirical evidence from numerical experiments supporting these theoretical guarantees is also provided.
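To make the objective concrete, the following sketch evaluates the sum-of-densities objective for a candidate partition of a weighted complete graph. The density convention used here (total within-cluster edge weight divided by cluster size) is one common normalization and is an assumption for illustration; the paper's exact normalization may differ. The toy graph, with two planted clusters of heavy edges, is likewise hypothetical.

```python
import numpy as np

def cluster_density(W, C):
    """Density of the complete subgraph induced by node set C:
    total edge weight within C divided by |C| (an assumed convention;
    the paper's exact normalization may differ)."""
    idx = list(C)
    sub = W[np.ix_(idx, idx)]
    # each undirected edge is counted twice in a symmetric W
    return (sub.sum() - np.trace(sub)) / (2 * len(idx))

def objective(W, clusters):
    """Sum of densities over k disjoint candidate cliques."""
    return sum(cluster_density(W, C) for C in clusters)

# Toy instance: 6 nodes, two planted clusters {0,1,2} and {3,4,5};
# within-cluster edges have weight 1.0, cross-cluster edges 0.1.
W = np.zeros((6, 6))
for i in range(6):
    for j in range(6):
        if i != j:
            W[i, j] = 1.0 if (i < 3) == (j < 3) else 0.1

print(objective(W, [{0, 1, 2}, {3, 4, 5}]))  # -> 2.0
```

The exact SDP relaxation studied in the paper replaces the combinatorial search over partitions with an optimization over positive semidefinite matrices; this sketch only shows what the relaxation is trying to maximize.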
Notes
The penalty parameter \(\beta = \min \{ \max \{5n/k, 80 \}, 500\}/2\) was chosen via simulation and appears to work well for most problem instances.
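Read literally, the formula above clamps \(5n/k\) to the interval \([80, 500]\) before halving. A minimal sketch, assuming that literal reading:

```python
def penalty_beta(n, k):
    """Penalty parameter from the note, read literally:
    beta = min(max(5n/k, 80), 500) / 2,
    i.e. clamp 5n/k to [80, 500], then halve."""
    return min(max(5 * n / k, 80), 500) / 2

print(penalty_beta(200, 5))    # 5*200/5 = 200 is inside [80, 500] -> 100.0
print(penalty_beta(10, 5))     # 5*10/5 = 10 clamps up to 80      -> 40.0
print(penalty_beta(10000, 5))  # 5*10000/5 = 10000 clamps to 500  -> 250.0
```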
Acknowledgments
This research was supported in part by the Institute for Mathematics and its Applications with funds provided by the National Science Foundation, and by a Postgraduate Scholarship from NSERC (Natural Science and Engineering Research Council of Canada). I am grateful to Stephen Vavasis, Henry Wolkowicz, Levent Tunçel, Shai Ben-David, Inderjit Dhillon, Ben Recht, and Ting Kei Pong for their helpful comments and suggestions. I am especially grateful to Ting Kei for his help implementing the ADMM algorithms used to perform the numerical trials. I would also like to thank Warren Schudy for suggesting some relevant references that were omitted in an earlier version. Finally, I thank the two anonymous reviewers, whose suggestions vastly improved the presentation and organization of this paper.
Ames, B.P.W. Guaranteed clustering and biclustering via semidefinite programming. Math. Program. 147, 429–465 (2014). https://doi.org/10.1007/s10107-013-0729-x