
Dual-Constrained Deep Semi-Supervised Coupled Factorization Network with Enriched Prior

Abstract

Nonnegativity-based matrix factorization is usually powerful for learning parts-based "shallow" representations, but it fails to discover the deep hidden information within both the basis-concept and representation spaces. In this paper, we therefore propose a new dual-constrained deep semi-supervised coupled factorization network (DS2CF-Net) for learning hierarchical representations. DS2CF-Net is formulated as a joint partial-label- and structure-constrained deep factorization network built from multiple layers of linear transformations, which jointly updates the basic concepts and new representations in each layer. An error-correction mechanism with a feature-fusion strategy is also integrated between consecutive layers to improve the representation ability of the features. To improve the discriminating power of both the representations and the coefficients in the feature space, we explicitly consider how to enrich the prior knowledge through coefficient-based label prediction, and incorporate the enriched prior knowledge as additional label and structure constraints. Specifically, the label constraint forces intra-class samples to share the same coordinate in the feature space, while the structure constraint forces the coefficients in each layer to be block-diagonal, so that the enriched prior knowledge becomes more accurate. In addition, we integrate adaptive dual-graph learning to retain the locality structures of both the data manifold and the feature manifold in each layer. Finally, a fine-tuning process refines the structure-constraint matrix and the data weight matrix in each layer using the predicted labels, yielding more accurate representations. Extensive simulations on public databases show that our method obtains state-of-the-art performance.
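The abstract describes the DS2CF-Net pipeline only at a high level. As an illustration of two of the building blocks it relies on, the Python/NumPy sketch below combines greedy layer-wise nonnegative factorization with a label-constraint matrix in the spirit of constrained NMF, so that intra-class labeled samples share one coordinate in the deepest layer. This is a minimal sketch under stated assumptions, not the authors' DS2CF-Net implementation: the function names (build_label_constraint, deep_label_constrained_nmf) are hypothetical, and the block-diagonal structure constraint, adaptive dual-graph regularization, error-correction/feature-fusion step, and fine-tuning stage are all omitted.

```python
import numpy as np

def build_label_constraint(labels, n_unlabeled):
    """Label-constraint matrix A (in the spirit of constrained NMF): labeled samples
    of the same class share one row of the free matrix Z, so their coefficients
    H = A @ Z coincide; each unlabeled sample keeps its own free row."""
    classes = np.unique(labels)
    c, n_l = len(classes), len(labels)
    A = np.zeros((n_l + n_unlabeled, c + n_unlabeled))
    for i, y in enumerate(labels):
        A[i, np.where(classes == y)[0][0]] = 1.0
    A[n_l:, c:] = np.eye(n_unlabeled)
    return A

def deep_label_constrained_nmf(X, layer_sizes, labels, n_iter=200, eps=1e-9):
    """Two-stage sketch: (1) greedy layer-wise NMF X ~ W_1 W_2 ... H_L,
    (2) label-constrained multiplicative updates of the deepest coefficients,
    keeping the composed basis fixed for brevity."""
    rng = np.random.default_rng(0)
    H, Ws = X, []
    for k in layer_sizes:                       # stage 1: layer-wise pre-factorization
        d, n = H.shape
        W, V = rng.random((d, k)), rng.random((k, n))
        for _ in range(n_iter):                 # standard NMF multiplicative updates
            V *= (W.T @ H) / (W.T @ W @ V + eps)
            W *= (H @ V.T) / (W @ V @ V.T + eps)
        Ws.append(W)
        H = V
    A = build_label_constraint(labels, X.shape[1] - len(labels))
    Z = rng.random((A.shape[1], layer_sizes[-1]))
    Phi = np.linalg.multi_dot(Ws) if len(Ws) > 1 else Ws[0]   # composed deep basis
    for _ in range(n_iter):                     # stage 2: X ~ Phi (A Z)^T, update Z only
        Z *= (A.T @ X.T @ Phi) / (A.T @ A @ Z @ (Phi.T @ Phi) + eps)
    return Ws, (A @ Z).T                        # deepest-layer coefficients, shape (k_L, n)

# Toy usage: 100-dim nonnegative data, 60 samples, the first 20 labeled into 2 classes.
rng = np.random.default_rng(1)
X = rng.random((100, 60))
labels = np.array([0] * 10 + [1] * 10)
Ws, H_final = deep_label_constrained_nmf(X, layer_sizes=[40, 10], labels=labels)
print(H_final.shape)   # (10, 60); columns of same-class labeled samples are identical
```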




Acknowledgements

The authors would like to express their sincere thanks to the anonymous reviewers for their constructive comments and suggestions, which have raised the paper to a higher standard. We also sincerely thank Prof. Mingliang Xu and Prof. Yi Yang for their professional discussions on the error-correction mechanism with feature-fusion strategy and on the fine-tuning process for refining the features. This work is partially supported by the National Natural Science Foundation of China (62072151, 62020106007, 61806035 and U1936217), the Anhui Provincial Natural Science Fund for Distinguished Young Scholars (2008085J30) and the Fundamental Research Funds for the Central Universities of China (JZ2019-HGPA0102).

Author information


Corresponding author

Correspondence to Zhao Zhang.

Additional information

Communicated by Suha Kwak.



Cite this article

Zhang, Y., Zhang, Z., Wang, Y. et al. Dual-Constrained Deep Semi-Supervised Coupled Factorization Network with Enriched Prior. Int J Comput Vis 129, 3233–3254 (2021). https://doi.org/10.1007/s11263-021-01524-1


Keywords

  • Deep semi-supervised coupled factorization network
  • Representation learning
  • Dual constraints
  • Clustering
  • Enriched prior
  • Error correction
  • Fine-tuning of features