Decay-weighted extreme learning machine for balance and optimization learning

  • Special Issue Paper
  • Published in: Machine Vision and Applications

Abstract

The original extreme learning machine (ELM) was designed for balanced data: it assigns the same misclassification cost to every sample when computing its solution. The weighted ELM assumed that balance can be achieved by equalizing the misclassification costs of the classes. This paper improves the previous weighted ELM with a decay-weight matrix for balance and optimization learning. The decay-weight matrix is based on the number of samples in each class, but the weight sums of the classes are not required to be equal: as the sample count of a class decreases, its weight sum also decreases. By adjusting the decay velocity, the classifier can place the decision boundary in a more appropriate position. Experimental results show that the decay-weighted ELM achieves better performance on imbalanced classification tasks, particularly multiclass tasks. The method was successfully applied to build the prediction model of an urban traffic congestion prediction system.
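
The abstract does not give the exact form of the decay-weight matrix, so the sketch below is only illustrative. It assumes a per-class weight of 1/n_c^gamma plugged into the standard regularized weighted-ELM solution beta = (I/C + H'WH)^(-1) H'WT, with the exponent gamma standing in for the "decay velocity" (gamma = 1 gives equal per-class weight sums as in the earlier weighted ELM, gamma = 0 gives the unweighted ELM). The function names, the sigmoid activation, and the decay rule itself are assumptions made for this sketch, not the authors' implementation:

import numpy as np

def decay_weighted_elm_fit(X, y, n_hidden=100, C=1.0, gamma=0.5, seed=0):
    """Train a weighted ELM whose per-class weights decay with class size."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y)
    n, d = X.shape
    classes = np.unique(y)
    idx = np.searchsorted(classes, y)
    # One-hot targets in {-1, +1}, as in the multiclass ELM formulation.
    T = -np.ones((n, classes.size))
    T[np.arange(n), idx] = 1.0
    # Random hidden layer with sigmoid activation, as in the original ELM.
    A = rng.uniform(-1.0, 1.0, (d, n_hidden))
    b = rng.uniform(-1.0, 1.0, n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ A + b)))
    # Assumed decay rule: per-sample weight 1 / n_c**gamma, where n_c is the
    # size of the sample's class and gamma controls the decay velocity.
    counts = np.array([(y == c).sum() for c in classes], dtype=float)
    w = (1.0 / counts**gamma)[idx]
    # Regularized weighted least squares: beta = (I/C + H'WH)^(-1) H'WT.
    HtW = H.T * w
    beta = np.linalg.solve(np.eye(n_hidden) / C + HtW @ H, HtW @ T)
    return A, b, beta, classes

def decay_weighted_elm_predict(X, A, b, beta, classes):
    H = 1.0 / (1.0 + np.exp(-(X @ A + b)))
    return classes[np.argmax(H @ beta, axis=1)]

With gamma between 0 and 1, minority classes receive larger per-sample weights than majority classes, yet each class's total weight (n_c^(1 - gamma)) still shrinks as its sample count shrinks, which is the behaviour the abstract describes.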

Acknowledgements

This work was supported by the National Natural Science Foundation of P. R. China (Nos. 61272357, 61300074) and the National Key Research and Development Program of China (Nos. 2016YFB0700502, 2016YFB1001404).

Author information

Correspondence to Qing Shen or Xiaojuan Ban.

About this article

Cite this article

Shen, Q., Ban, X., Liu, R. et al. Decay-weighted extreme learning machine for balance and optimization learning. Machine Vision and Applications 28, 743–753 (2017). https://doi.org/10.1007/s00138-017-0828-4
