Non-negative matrix factorization based modeling and training algorithm for multi-label learning
Multi-label learning is more complicated than single-label learning because the semantics of the instances usually overlap and are not identical. Many algorithms lose effectiveness when the correlations in the feature and label spaces are not fully exploited. To this end, we propose a novel non-negative matrix factorization (NMF) based modeling and training algorithm that learns from both the adjacencies of the instances and the labels of the training set. In the modeling process, a set of generators is constructed, and the associations among generators, instances, and labels are established; label prediction is then conducted on top of these associations. In the training process, the parameters involved in the modeling are determined. Specifically, an NMF based algorithm is proposed to determine the associations between generators and instances, and a non-negative least squares optimization algorithm is applied to determine the associations between generators and labels. The proposed algorithm takes full advantage of the smoothness assumption, so that the labels are properly propagated. Experiments were carried out on six benchmark data sets. The results demonstrate the effectiveness of the proposed algorithms.
Keywords: multi-label learning, non-negative least squares optimization, non-negative matrix factorization, smoothness assumption
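The two-stage training procedure described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the generator count `k`, the toy data, the multiplicative-update NMF, the projected-gradient non-negative least squares solver, and the 0.5 decision threshold are all illustrative assumptions, and the paper's actual construction of generators from instance adjacencies is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (hypothetical): 20 instances, 8 features, 3 labels.
X = np.abs(rng.normal(size=(20, 8)))           # non-negative feature matrix
Y = (rng.random((20, 3)) > 0.5).astype(float)  # binary label matrix

k = 4  # number of generators (a tunable parameter)
eps = 1e-9

# Stage 1: NMF via multiplicative updates, X ~ W H.
# Rows of W play the role of instance-generator associations.
W = np.abs(rng.normal(size=(20, k)))
H = np.abs(rng.normal(size=(k, 8)))
for _ in range(300):
    W *= (X @ H.T) / (W @ H @ H.T + eps)
    H *= (W.T @ X) / (W.T @ W @ H + eps)

# Stage 2: non-negative least squares for the generator-label matrix B,
# i.e. min ||W B - Y||^2 subject to B >= 0, via projected gradient.
B = np.abs(rng.normal(size=(k, 3)))
lr = 1.0 / (np.linalg.norm(W, 2) ** 2 + eps)   # step size from spectral norm
for _ in range(500):
    grad = W.T @ (W @ B - Y)
    B = np.maximum(B - lr * grad, 0.0)

# Prediction: real-valued label scores, thresholded to binary labels.
scores = W @ B
pred = (scores > 0.5).astype(float)
```

The factor `W` links instances to generators and `B` links generators to labels, so the product `W @ B` propagates label information through the generators, which is the role the smoothness assumption plays in the paper's formulation.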
The authors are grateful for the support of the National Natural Science Foundation of China (Grant Nos. 61402076, 61572104, 61103146), the Fundamental Research Funds for the Central Universities (DUT17JC04), and the Project of the Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University (93K172017K03).