Abstract
Semi-supervised feature selection is an active topic in machine learning and data mining. The Laplacian support vector machine (LapSVM) has been successfully applied to semi-supervised learning; however, it cannot be directly applied to feature selection. To remedy this, we propose a sparse Laplacian support vector machine (SLapSVM) and apply it to semi-supervised feature selection. On the basis of LapSVM, SLapSVM introduces \(\ell _1\)-norm regularization, so its solution is sparse. In addition, the training procedure of SLapSVM can be formulated as a quadratic programming problem, which guarantees that its solution is unique and globally optimal. SLapSVM performs feature selection and classification at the same time. Experimental results on semi-supervised classification problems show the feasibility and effectiveness of the proposed semi-supervised learning algorithm.
This work was supported in part by the Natural Science Foundation of the Jiangsu Higher Education Institutions of China under Grant No. 19KJA550002, by the Six Talent Peak Project of Jiangsu Province of China under Grant No. XYDXX-054, by the Priority Academic Program Development of Jiangsu Higher Education Institutions, and by the Collaborative Innovation Center of Novel Software Technology and Industrialization.
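The core idea in the abstract (a hinge-loss classifier plus a graph-Laplacian smoothness term over labeled and unlabeled points, with an \(\ell _1\) penalty that zeroes out uninformative feature weights) can be illustrated with a toy sketch. The paper formulates training as a quadratic program; the subgradient/proximal loop below is only a hedged stand-in for that solver, and all data, graph, and hyperparameter choices (`knn_laplacian`, `lam_lap`, `lam_l1`, learning rate) are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2 informative features out of 6; few labeled, many unlabeled points.
n_l, n_u, d = 20, 80, 6
X = rng.normal(size=(n_l + n_u, d))
y = np.sign(X[:n_l, 0] + X[:n_l, 1])  # labels depend only on features 0 and 1

def knn_laplacian(X, k=5):
    """Unnormalized graph Laplacian of a symmetrized k-NN graph (illustrative)."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros_like(D2)
    for i in range(len(X)):
        nbrs = np.argsort(D2[i])[1:k + 1]  # skip self at distance 0
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)
    return np.diag(W.sum(1)) - W

L = knn_laplacian(X)  # built from labeled AND unlabeled points

w = np.zeros(d)
lam_lap, lam_l1, lr = 0.1, 0.05, 0.01
for _ in range(500):
    # Subgradient of the mean hinge loss on the labeled points.
    margin = 1.0 - y * (X[:n_l] @ w)
    hinge_grad = -(X[:n_l] * (y * (margin > 0))[:, None]).sum(0) / n_l
    # Gradient of the manifold term lam_lap * (Xw)^T L (Xw) / n^2.
    lap_grad = 2.0 * lam_lap * (X.T @ (L @ (X @ w))) / len(X) ** 2
    w -= lr * (hinge_grad + lap_grad)
    # Proximal soft-thresholding for the l1 term: yields exact zeros,
    # which is what makes the weight vector usable for feature selection.
    w = np.sign(w) * np.maximum(np.abs(w) - lr * lam_l1, 0.0)

# Features with nonzero weight are the selected ones.
selected = np.flatnonzero(np.abs(w) > 1e-6)
print("selected feature indices:", selected)
```

The soft-threshold step is why an \(\ell _1\) penalty enables feature selection at all: unlike an \(\ell _2\) penalty, it drives individual coordinates of `w` exactly to zero, so classification and feature selection come out of the same optimization.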
Copyright information
© 2020 Springer Nature Singapore Pte Ltd.
Cite this paper
Zhang, L., Zheng, X., Xu, Z. (2020). Semi-supervised Feature Selection Using Sparse Laplacian Support Vector Machine. In: Zhang, H., Zhang, Z., Wu, Z., Hao, T. (eds) Neural Computing for Advanced Applications. NCAA 2020. Communications in Computer and Information Science, vol 1265. Springer, Singapore. https://doi.org/10.1007/978-981-15-7670-6_10
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-7669-0
Online ISBN: 978-981-15-7670-6