Soft Computing, Volume 22, Issue 6, pp 1945–1957

Large-scale linear nonparallel SVMs

  • Dalian Liu
  • Dewei Li
  • Yong Shi
  • Yingjie Tian
Methodologies and Application


Abstract

Large-scale problems have become a very active topic in machine learning. In the era of big data, solving such problems is both challenging and meaningful. The standard SVM can perform linear classification on large-scale problems effectively, with acceptable training time and excellent prediction accuracy. Meanwhile, the nonparallel SVM (NPSVM) and the ramp loss nonparallel SVM (RNPSVM) have been shown to outperform the standard SVM on benchmark datasets, which motivates introducing NPSVMs into the large-scale setting. In this paper, we propose large-scale linear NPSVMs, solved by the alternating direction method of multipliers (ADMM), to handle large-scale classification problems. ADMM breaks a large problem into smaller pieces, avoiding intractable subproblems and yielding higher training speed. The primal problems of NPSVM are convex and differentiable, so ADMM can handle them directly. The objective functions of RNPSVM, however, are composed of a convex part and a concave part, so they must first be processed by the CCCP algorithm and transformed into a series of convex programs; ADMM is then applied to solve these programs at every iteration. Experiments with NPSVMs on large-scale problems verify that the algorithms classify large-scale tasks effectively.
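To illustrate the splitting idea described above, the following is a minimal, hypothetical numpy sketch of scaled-form consensus ADMM applied to a standard hinge-loss linear SVM — not the paper's exact NPSVM or RNPSVM formulation. The split variable `z = A w` (with `A = diag(y) X`), the penalty parameter `rho`, and the toy data are illustrative assumptions; the point is only how ADMM reduces a large coupled problem to a cached linear solve plus an elementwise proximal step.

```python
import numpy as np

def admm_linear_svm(X, y, C=1.0, rho=1.0, iters=200):
    """Scaled-form ADMM for the L2-regularized hinge-loss linear SVM:
        min_w 0.5*||w||^2 + C * sum_i max(0, 1 - y_i * x_i . w)
    Split: z = A w with A = diag(y) X, so the hinge acts elementwise on z."""
    A = y[:, None] * X                      # n x d matrix of y_i * x_i
    n, d = A.shape
    w = np.zeros(d)
    z = np.zeros(n)
    u = np.zeros(n)                         # scaled dual variable
    # The w-update solves (I + rho * A^T A) w = rho * A^T (z - u);
    # the coefficient matrix is fixed, so it can be formed once.
    M = np.eye(d) + rho * (A.T @ A)
    for _ in range(iters):
        w = np.linalg.solve(M, rho * A.T @ (z - u))
        v = A @ w + u
        # z-update: elementwise proximal operator of h(t) = C * max(0, 1 - t)
        z = np.where(v > 1.0,
                     v,
                     np.where(v < 1.0 - C / rho, v + C / rho, 1.0))
        u = u + A @ w - z                   # dual update
    return w

# Toy usage on a linearly separable 2-D problem (illustrative data)
X = np.array([[2., 2.], [3., 3.], [2., 3.],
              [-2., -2.], [-3., -3.], [-2., -3.]])
y = np.array([1., 1., 1., -1., -1., -1.])
w = admm_linear_svm(X, y)
pred = np.sign(X @ w)
```

Each iteration touches the data only through `A @ w` and `A.T @ (...)`, which is what makes this style of splitting attractive at scale; for RNPSVM, CCCP would wrap such a convex solve inside an outer loop that repeatedly linearizes the concave part of the ramp loss.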


Keywords: Large-scale · Nonparallel SVM · Ramp loss function · ADMM



Acknowledgements

This work has been partially supported by grants from the National Natural Science Foundation of China (Nos. 61472390, 11271361, 71331005, 11226089 and 91546201), the Major International (Regional) Joint Research Project (No. 71110107026) and the Beijing Natural Science Foundation (No. 1162005).

Compliance with ethical standards

Conflict of interest

We declare that we have no conflicts of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.



Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  • Dalian Liu (1, 2)
  • Dewei Li (3, 4)
  • Yong Shi (1, 4, 5, 6)
  • Yingjie Tian (4, 5)

  1. School of Computer and Information Technology, Beijing Jiaotong University, Beijing, China
  2. Department of Basic Course Teaching, Beijing Union University, Beijing, China
  3. School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing, China
  4. Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing, China
  5. Key Laboratory of Big Data Mining and Knowledge Management, Chinese Academy of Sciences, Beijing, China
  6. College of Information Science and Technology, University of Nebraska at Omaha, Omaha, USA
