
Towards Robust Arbitrarily Oriented Subspace Clustering

  • Zhong Zhang
  • Chongming Gao
  • Chongzhi Liu
  • Qinli Yang
  • Junming Shao
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11446)

Abstract

Clustering high-dimensional data is challenging since meaningful clusters usually hide in arbitrarily oriented subspaces, and classical clustering algorithms like k-means tend to fail in such cases. Subspace clustering has thus attracted growing attention in the last decade, and many algorithms have been proposed, such as ORCLUS and 4C. However, existing approaches are usually sensitive to global and/or local noisy points, and overlapping subspace clusters remain little explored. Moreover, these approaches usually involve an exhaustive local search for correlated points or subspaces, which is infeasible in some cases. To deal with these problems, we introduce in this paper a new subspace clustering algorithm called RAOSC, which formulates Robust Arbitrarily Oriented Subspace Clustering as a group-structure low-rank optimization problem. RAOSC is able to recover subspace clusters from a sea of noise, while noise and overlapping points are naturally identified during the optimization process. Unlike existing low-rank based subspace clustering methods, RAOSC can explicitly produce the subspaces of the clusters without any prior knowledge of the subspace dimensionality. Furthermore, RAOSC does not need a post-processing procedure to obtain the clustering result. Extensive experiments on both synthetic and real-world data sets demonstrate that RAOSC yields high-quality clusterings and outperforms many state-of-the-art algorithms.
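The paper's RAOSC formulation is not reproduced here, but the low-rank idea it builds on (cf. the low-rank representation of Liu et al. [12, 13]) can be illustrated with a minimal, self-contained sketch: for noise-free points drawn from independent subspaces, the minimum-nuclear-norm self-representation has the closed form Z = VVᵀ (the shape interaction matrix, with V from the skinny SVD of the data), which is block-diagonal by cluster. All names and parameters below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def subspace_points(n, ambient=10, dim=2):
    """Sample n points from a random dim-dimensional subspace of R^ambient."""
    basis = np.linalg.qr(rng.standard_normal((ambient, dim)))[0]  # orthonormal basis
    return basis @ rng.standard_normal((dim, n))

# Two clusters of 30 points, each in its own arbitrarily oriented 2-D subspace.
X = np.hstack([subspace_points(30), subspace_points(30)])  # columns are points

# Skinny SVD of the data matrix; in the noiseless case the optimal
# low-rank self-representation is Z* = V V^T (Costeira-Kanade SIM).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int(np.sum(s > 1e-8))      # numerical rank: 2 + 2 = 4 for independent subspaces
V = Vt[:r].T
Z = V @ V.T                    # block-diagonal up to a permutation of the points

# Within-cluster affinities dominate cross-cluster ones.
A = np.abs(Z)
within = (A[:30, :30].mean() + A[30:, 30:].mean()) / 2
across = A[:30, 30:].mean()
```

With noisy or overlapping points this closed form breaks down, which is exactly the regime the robust group-structure optimization in the paper targets; the sketch only shows why a block-diagonal low-rank representation encodes the cluster structure.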

Keywords

Subspace clustering · Correlation clustering

Notes

Acknowledgments

This work is supported by the National Natural Science Foundation of China (61403062, 61433014, 41601025), the Science-Technology Foundation for Young Scientists of Sichuan Province (2016JQ0007), the Fok Ying-Tong Education Foundation for Young Teachers in the Higher Education Institutions of China (161062) and the National Key Research and Development Program (2016YFB0502300).

References

  1. Achtert, E., Goldhofer, S., Kriegel, H.P., Schubert, E., Zimek, A.: Evaluation of clusterings: metrics and visual support. In: Proceedings of the 28th IEEE International Conference on Data Engineering, pp. 1285–1288 (2012)
  2. Aggarwal, C.C., Wolf, J.L., Yu, P.S., Procopiuc, C., Park, J.S.: Fast algorithms for projected clustering. In: Proceedings of the 1999 ACM SIGMOD International Conference on Management of Data, vol. 28 (1999)
  3. Aggarwal, C.C., Yu, P.S.: Finding generalized projected clusters in high dimensional spaces. In: Proceedings of the 2000 ACM SIGMOD International Conference on Management of Data, vol. 29 (2000)
  4. Agrawal, R., Gehrke, J., Gunopulos, D., Raghavan, P.: Automatic subspace clustering of high dimensional data for data mining applications. In: Proceedings of the 1998 ACM SIGMOD International Conference on Management of Data, vol. 27 (1998)
  5. Assent, I., Krieger, R., Müller, E., Seidl, T.: DUSC: dimensionality unbiased subspace clustering. In: Proceedings of the 7th IEEE International Conference on Data Mining, pp. 409–414 (2007)
  6. Böhm, C., Kailing, K., Kröger, P., Zimek, A.: Computing clusters of correlation connected objects. In: Proceedings of the 2004 ACM SIGMOD International Conference on Management of Data, pp. 455–466 (2004)
  7. Cheng, C.H., Fu, A.W., Zhang, Y.: Entropy-based subspace clustering for mining numerical data. In: Proceedings of the 5th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 84–93 (1999)
  8. Goebl, S., He, X., Plant, C., Böhm, C.: Finding the optimal subspace for clustering. In: Proceedings of the 14th IEEE International Conference on Data Mining, pp. 130–139 (2014)
  9. Günnemann, S., Färber, I., Virochsiri, K., Seidl, T.: Subspace correlation clustering: finding locally correlated dimensions in subspace projections of the data. In: Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 352–360 (2012)
  10. Kailing, K., Kriegel, H.P., Kröger, P.: Density-connected subspace clustering for high-dimensional data. In: Proceedings of the 2004 SIAM International Conference on Data Mining, pp. 246–256 (2004)
  11. Kriegel, H.P., Kröger, P., Zimek, A.: Clustering high-dimensional data: a survey on subspace clustering, pattern-based clustering, and correlation clustering. ACM Trans. Knowl. Discov. Data 3(1), 1 (2009)
  12. Liu, G., Lin, Z., Yan, S., Sun, J., Yu, Y., Ma, Y.: Robust recovery of subspace structures by low-rank representation. IEEE Trans. Pattern Anal. Mach. Intell. 35(1), 171–184 (2013)
  13. Liu, G., Lin, Z., Yu, Y.: Robust subspace segmentation by low-rank representation. In: Proceedings of the 27th International Conference on Machine Learning, pp. 663–670 (2010)
  14. Mautz, D., Ye, W., Plant, C., Böhm, C.: Towards an optimal subspace for k-means. In: Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 365–373 (2017)
  15. Mautz, D., Ye, W., Plant, C., Böhm, C.: Discovering non-redundant k-means clusterings in optimal subspaces. In: Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 1973–1982 (2018)
  16. Nie, F., Yuan, J., Huang, H.: Optimal mean robust principal component analysis. In: Proceedings of the 31st International Conference on Machine Learning, vol. 32, pp. 1062–1070 (2014)
  17. Nie, F., Huang, H.: Subspace clustering via new low-rank model with discrete group structure constraint. In: Proceedings of the 25th International Joint Conference on Artificial Intelligence, pp. 1874–1880 (2016)
  18. Shao, J., Gao, C., Zeng, W., Song, J., Yang, Q.: Synchronization-inspired co-clustering and its application to gene expression data. In: Proceedings of the 2017 IEEE International Conference on Data Mining, pp. 1075–1080 (2017)
  19. Shao, J., Wang, X., Yang, Q., Plant, C., Böhm, C.: Synchronization-based scalable subspace clustering of high-dimensional data. Knowl. Inf. Syst. 52(1), 83–111 (2017)
  20. Shao, J., Yang, Q., Dang, H.V., Schmidt, B., Kramer, S.: Scalable clustering by iterative partitioning and point attractor representation. ACM Trans. Knowl. Discov. Data 11(1), 5 (2016)
  21. Tung, A.K.H., Xu, X., Ooi, B.C.: CURLER: finding and visualizing nonlinear correlation clusters. In: Proceedings of the 2005 ACM SIGMOD International Conference on Management of Data, pp. 467–478 (2005)
  22. Ye, W., Maurus, S., Hubig, N., Plant, C.: Generalized independent subspace clustering. In: Proceedings of the 2016 IEEE International Conference on Data Mining, pp. 569–578 (2016)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Zhong Zhang (1)
  • Chongming Gao (1)
  • Chongzhi Liu (1)
  • Qinli Yang (1)
  • Junming Shao (1)
  1. School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, China
