
Lagrangian supervised and semi-supervised extreme learning machine


Abstract

Two extreme learning machine (ELM) frameworks are proposed for supervised and semi-supervised classification. The first, the Lagrangian extreme learning machine (LELM), is derived from optimality conditions and duality theory. LELM is then extended to the semi-supervised setting to obtain a semi-supervised extreme learning machine (Lap-LELM), which incorporates manifold regularization into LELM to improve performance when training information is insufficient. To avoid the inconvenience caused by matrix inversion, the Sherman-Morrison-Woodbury (SMW) identity is used in both LELM and Lap-LELM, leading to two smaller unconstrained minimization problems. The proposed models are solvable in a space whose dimensionality equals the number of sample points, and the resulting iterative algorithms converge globally with low computational burden. To verify the feasibility and effectiveness of the proposed methods, we perform experiments on a synthetic dataset, near-infrared (NIR) spectroscopy datasets and benchmark datasets. Experimental results demonstrate that the proposed methods outperform traditional supervised and semi-supervised methods on most of the considered datasets.
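As a rough illustration of the kind of computation the abstract describes, the sketch below applies a Lagrangian-SVM-style fixed-point iteration (in the spirit of Mangasarian and Musicant's Lagrangian SVM) to random ELM hidden features, using the SMW identity so that the only explicit linear solve is of size L x L (L hidden nodes) rather than n x n (n samples). This is a minimal sketch under assumed choices (sigmoid activation, regularization parameter C, step size alpha, and all variable names), not the authors' exact LELM or Lap-LELM algorithm; the semi-supervised Lap-LELM would additionally include a graph-Laplacian manifold term, omitted here.

```python
import numpy as np

def elm_hidden(X, W, b):
    """Random hidden-layer output H (n x L), sigmoid activation (assumed choice)."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def smw_solve(G, C, v):
    """Return (I/C + G G^T)^{-1} v via the Sherman-Morrison-Woodbury identity.
    Only an L x L system is solved, where L = G.shape[1]."""
    L = G.shape[1]
    inner = np.linalg.solve(np.eye(L) / C + G.T @ G, G.T @ v)
    return C * (v - G @ inner)

def lagrangian_elm_fit(X, y, n_hidden=100, C=10.0, max_iter=500, tol=1e-6, seed=0):
    """Fixed-point iteration u_{k+1} = Q^{-1}(e + ((Q u_k - e) - alpha * u_k)_+),
    with Q = I/C + G G^T and G = diag(y) H, following the Lagrangian SVM scheme;
    labels y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(size=(d, n_hidden))     # random input weights, fixed after drawing
    b = rng.normal(size=n_hidden)          # random hidden biases
    H = elm_hidden(X, W, b)
    G = y[:, None] * H                     # diag(y) H
    e = np.ones(n)
    alpha = 1.9 / C                        # 0 < alpha < 2/C, the usual convergence condition
    q_times = lambda u: u / C + G @ (G.T @ u)
    u = smw_solve(G, C, e)                 # dual variables live in the n-dimensional sample space
    for _ in range(max_iter):
        u_next = smw_solve(G, C, e + np.maximum(q_times(u) - e - alpha * u, 0.0))
        if np.linalg.norm(u_next - u) < tol:
            u = u_next
            break
        u = u_next
    beta = H.T @ (y * u)                   # output weights recovered from the dual solution
    return W, b, beta

def lagrangian_elm_predict(X, W, b, beta):
    return np.sign(elm_hidden(X, W, b) @ beta)
```

The point this sketch illustrates is the one the abstract emphasises: the dual iteration runs over a vector whose length equals the number of sample points, while the SMW identity pushes the explicit matrix inversion into the (typically much smaller) hidden-node dimension.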


Notes

  1. http://archive.ics.uci.edu/ml/datasets.html

  2. http://www.cad.zju.edu.cn/home/dengcai/Data/MLData.html


Acknowledgements

This work was supported in part by the National Natural Science Foundation of China (No. 11471010) and the Chinese Universities Scientific Fund.

Author information


Corresponding author

Correspondence to Liming Yang.


About this article


Cite this article

Ma, J., Wen, Y. & Yang, L. Lagrangian supervised and semi-supervised extreme learning machine. Appl Intell 49, 303–318 (2019). https://doi.org/10.1007/s10489-018-1273-4

