Abstract
Support vector machines (SVMs) have been widely applied in real-world applications owing to their strong classification performance, and a large number of SVM variants have been proposed. In this paper, we present a novel SVM method that takes both dynamic graph learning and self-paced learning into account. Specifically, we employ self-paced learning to assign large weights to important samples, learn a transformation matrix that conducts feature selection to remove redundant features, and learn a graph matrix from the low-dimensional representation of the original data to preserve the data structure. As a result, both the important samples and the useful features are used to select support vectors within the SVM framework. Experiments on four synthetic and sixteen benchmark data sets demonstrate that our method outperforms state-of-the-art methods on both binary and multi-class classification tasks.
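To make the described pipeline concrete, the following minimal sketch (Python with scikit-learn) alternates between self-paced sample weighting, sparse feature selection, and fitting a sample-weighted linear SVM. It is an illustrative approximation under assumed choices: an L1-penalised LinearSVC stands in for the learned transformation matrix, a hard self-paced threshold with a small weight floor replaces the paper's self-paced regularizer, and the graph-learning term is omitted. It is not the paper's actual objective or optimization algorithm.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectFromModel
from sklearn.svm import LinearSVC

# Synthetic data: 200 samples, 50 features, only 10 of them informative.
X, y = make_classification(n_samples=200, n_features=50, n_informative=10,
                           random_state=0)
y_signed = 2 * y - 1                # labels in {-1, +1} for the hinge loss

v = np.ones(X.shape[0])             # self-paced sample weights (all "easy" at start)
age = 1.0                           # self-paced threshold on the per-sample loss

for _ in range(5):
    # (1) Feature selection: an L1-penalised linear SVM is used here as a
    #     stand-in for the learned transformation matrix that removes
    #     redundant features.
    selector = SelectFromModel(
        LinearSVC(C=1.0, penalty="l1", dual=False, max_iter=5000))
    selector.fit(X, y, sample_weight=v)
    X_sel = selector.transform(X)

    # (2) Fit a sample-weighted linear SVM on the selected features.
    clf = LinearSVC(C=1.0, max_iter=5000).fit(X_sel, y, sample_weight=v)

    # (3) Self-paced update: samples with small hinge loss keep full weight,
    #     harder samples are down-weighted; the threshold grows so harder
    #     samples are gradually admitted in later iterations.
    loss = np.maximum(0.0, 1.0 - y_signed * clf.decision_function(X_sel))
    v = np.where(loss < age, 1.0, 1e-3)   # small floor avoids all-zero weights
    age *= 1.5

print("selected features:", int(selector.get_support().sum()))
print("training accuracy: %.3f" % clf.score(X_sel, y))
```

In this toy loop the easy (small-loss) samples dominate early iterations, which mirrors the role of self-paced learning in the proposed method; the actual method additionally learns a graph matrix on the low-dimensional data to preserve its structure.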
Acknowledgments
This work is partially supported by the China Key Research Program (Grant No: 2016YFB1000905); the Natural Science Foundation of China (Grants No: 61876046 and 61573270); the Guangxi Collaborative Innovation Center of Multi-Source Information Integration and Intelligent Processing; the Guangxi High Institutions Program of Introducing 100 High-Level Overseas Talents; the Strategic Research Excellence Fund at Massey University; the Marsden Fund of New Zealand (MAU1721); the Project of Guangxi Science and Technology (GuiKeAD17195062); and the Research Fund of Guangxi Key Lab of Multisource Information Mining and Security (18-A-01-01).
This article belongs to the Topical Collection: Computational Social Science as the Ultimate Web Intelligence
Guest Editors: Xiaohui Tao, Juan D. Velasquez, Jiming Liu, and Ning Zhong
Cite this article
Hu, R., Zhu, X., Zhu, Y. et al. Robust SVM with adaptive graph learning. World Wide Web 23, 1945–1968 (2020). https://doi.org/10.1007/s11280-019-00766-x