Abstract
Multi-label dimensionality reduction is an appealing yet challenging task in data mining and machine learning. Previous work on multi-label dimensionality reduction has mainly been conducted in an unsupervised or fully supervised manner, ignoring abundant unlabeled samples. In addition, most existing methods rely on pairwise correlations between samples and are therefore unable to utilize high-order sample information to improve performance. To address these challenges, we propose an approach called Semi-supervised Multi-label Dimensionality Reduction via Low Rank Representation (SMLD-LRR). SMLD-LRR first applies low rank representation in the feature space of the samples to compute a low rank constrained coefficient matrix, and then uses this coefficient matrix to capture the high-order structure of the samples. Next, it applies low rank representation in the label space of the labeled samples to explore the global correlations among labels. SMLD-LRR then employs the learned high-order sample structure to enforce consistency between samples in the original space and their counterparts in the projected subspace by maximizing the dependence between them. Finally, these two high-order correlations and the dependence term are incorporated into multi-label linear discriminant analysis for dimensionality reduction. Extensive experimental results on four multi-label datasets demonstrate that SMLD-LRR achieves better performance than other competitive methods across various evaluation criteria; it can also effectively exploit high-order label correlations to preserve sample structure in the projected subspace.
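Two ingredients recur in the abstract: the nuclear-norm machinery behind low rank representation, whose iterative solvers are built on singular value thresholding, and dependence maximization between original and projected samples, commonly measured with the Hilbert-Schmidt Independence Criterion (HSIC). The sketch below is only an illustration of these two primitives, not the paper's algorithm; the function names `svt` and `hsic` and the use of linear kernels are our assumptions.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of the
    nuclear norm, the basic step in iterative low-rank solvers.
    Shrinks every singular value of M by tau (clipping at zero)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def hsic(X, Y):
    """Empirical HSIC between paired samples X (n x d) and Y (n x m),
    here with linear kernels K = X X^T and L = Y Y^T.
    Larger values indicate stronger statistical dependence."""
    n = X.shape[0]
    K = X @ X.T
    L = Y @ Y.T
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

In a dependence-maximization formulation such as the one described above, `Y` would be the projected samples `X @ W`, and the projection `W` is chosen so that the HSIC term stays large while the low-rank terms constrain the sample and label structure.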
Notes
1. Available at http://mulan.sourceforge.net/datasets-mlc.html.
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
Liu, Y. (2018). Semi-supervised Multi-label Dimensionality Reduction via Low Rank Representation. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science(), vol 11303. Springer, Cham. https://doi.org/10.1007/978-3-030-04182-3_55
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-04181-6
Online ISBN: 978-3-030-04182-3
eBook Packages: Computer Science (R0)