Ensemble OS-ELM based on combination weight for data stream classification

  • Haiyang Yu
  • Xiaoying Sun
  • Jian Wang


For online classification, designing a self-adaptive model is a challenging task. To make the model adapt readily to fast-changing data streams, a novel ensemble OS-ELM is put forward. Unlike traditional ensemble methods, the proposed approach provides a new self-adaptive weight-update algorithm. In the online learning stage, both the current prediction accuracy and the historical record are considered. Based on the suffer loss and the norm of the output-layer vector, an aggregation model from game theory is adopted to calculate the combination weights. This strategy fully accounts for the differences among individual learners and helps the ensemble reduce the fitting error on each sequence fragment. In addition, an alternative hidden-layer output matrix can be calculated from the current fragment, yielding a stable network architecture for the next chunk. Interactive parameter optimization is thus avoided, and the automatic model is well suited to online learning. Numerical experiments are conducted on eight different UCI datasets. The results demonstrate that the proposed algorithm not only achieves better generalisation performance but also provides a faster learning procedure.
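The chunk-wise scheme the abstract describes — several OS-ELM base learners combined with weights that adapt to each learner's suffer loss on the current fragment — can be sketched roughly as follows. This is a minimal illustration, not the paper's method: the class and parameter names (`OSELM`, `WeightedEnsemble`, `eta`) are hypothetical, and a multiplicative exponential-weight update stands in for the paper's game-theoretic aggregation, whose exact form is not reproduced here.

```python
import numpy as np

class OSELM:
    """Minimal online sequential ELM: random sigmoid hidden layer,
    recursive least-squares update of the output weights."""
    def __init__(self, n_in, n_hidden, n_out, rng):
        self.W = rng.standard_normal((n_in, n_hidden))
        self.b = rng.standard_normal(n_hidden)
        self.beta = np.zeros((n_hidden, n_out))
        self.P = np.eye(n_hidden) * 1e3  # large init acts as weak regularization

    def _h(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def partial_fit(self, X, T):
        H = self._h(X)
        # standard OS-ELM recursive least-squares step
        K = self.P @ H.T @ np.linalg.inv(np.eye(len(X)) + H @ self.P @ H.T)
        self.beta += K @ (T - H @ self.beta)
        self.P -= K @ H @ self.P

    def predict(self, X):
        return self._h(X) @ self.beta

class WeightedEnsemble:
    """Combination weights shrink multiplicatively with each member's
    suffer loss on the current chunk (Hedge-style stand-in)."""
    def __init__(self, members, eta=1.0):
        self.members, self.eta = members, eta
        self.w = np.ones(len(members)) / len(members)

    def predict(self, X):
        preds = np.stack([m.predict(X) for m in self.members])
        return np.tensordot(self.w, preds, axes=1)  # weighted combination

    def update(self, X, T):
        losses = np.array([np.mean((m.predict(X) - T) ** 2)
                           for m in self.members])
        self.w *= np.exp(-self.eta * losses)  # penalize high-loss members
        self.w /= self.w.sum()                # renormalize
        for m in self.members:
            m.partial_fit(X, T)               # then learn the chunk

# demo on a toy separable stream, processed in chunks
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
T = np.eye(2)[(X[:, 0] > 0).astype(int)]  # one-hot label: sign of x0
ens = WeightedEnsemble([OSELM(2, 20, 2, np.random.default_rng(s))
                        for s in (1, 2, 3)])
for i in range(0, 200, 20):
    ens.update(X[i:i + 20], T[i:i + 20])
acc = np.mean(ens.predict(X).argmax(1) == T.argmax(1))
```

Because the weights are renormalized after every chunk, a learner that fits the current fragment poorly loses influence without being discarded, which matches the abstract's goal of tracking differences among individual learners.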


Keywords: Data stream classification · Ensemble OS-ELM · Game theory · Online learning



The work was supported by the National Key Research Project of China under Grant No. 2016YFB1001304, the National Natural Science Foundation of China under Grant No. 61572229, the JLUSTIRT High-level Innovation Team, and the Fundamental Research Funds for Central Universities under Grant No. 2017TD-19. The authors gratefully acknowledge financial support from the Research Centre for Intelligent Signal Identification and Equipment, Jilin Province.



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. College of Communication Engineering, Jilin University, Changchun, China
  2. College of Computer Science and Technology, Jilin University, Changchun, China
  3. Department of Intelligent Vehicle, China Automotive Engineering Research Institute (CAERI), Chongqing, China
