
A Robust ELM Algorithm for Compensating the Effect of Node Fault and Weight Noise


Part of the Lecture Notes in Networks and Systems book series (volume 457)

Abstract

Although the extreme learning machine (ELM) is an efficient and effective neural network approach, the traditional ELM technique still has some downsides. When the training set contains outlier samples, the trained network usually performs poorly. Performance also degrades when the trained network suffers from noise and faults. This paper studies the ELM technique under multiple imperfections: outlier training samples, weight noise, and node faults. It first derives a regularization term for handling weight noise and node faults. To handle outlier training samples, the maximum correntropy criterion (MCC) is incorporated into the objective function. A learning algorithm for faulty networks, namely the robust fault-aware ELM algorithm (RFAELM), is then proposed. Simulation results show that the proposed algorithm performs substantially better than two state-of-the-art robust algorithms.
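To make the idea concrete, the sketch below shows a minimal MCC-weighted ridge ELM in NumPy. This is not the authors' RFAELM: the function `mcc_elm`, its parameters, and the fixed-kernel-width reweighting scheme are illustrative assumptions. It only illustrates the two ingredients the abstract names — a ridge-style regularizer on the output weights and a Gaussian correntropy weight on each residual, which down-weights outlier samples during iteratively reweighted least squares.

```python
import numpy as np

def elm_features(X, W, b):
    # Random hidden layer: sigmoid activation of a fixed random projection.
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def mcc_elm(X, y, n_hidden=50, lam=1e-2, sigma=1.0, n_iter=20, seed=0):
    """Iteratively reweighted ridge ELM. Each sample's residual gets a
    Gaussian (correntropy) weight, so outliers barely influence the fit."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = elm_features(X, W, b)
    # Plain ridge solution as the starting point.
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)
    for _ in range(n_iter):
        r = y - H @ beta                          # per-sample residuals
        a = np.exp(-(r ** 2) / (2 * sigma ** 2))  # correntropy weights in (0, 1]
        Hw = H * a[:, None]                       # weighted design matrix
        beta = np.linalg.solve(H.T @ Hw + lam * np.eye(n_hidden), Hw.T @ y)
    return W, b, beta
```

On data with a single gross outlier, the first iteration already assigns that sample a weight near zero, so the refitted output weights track the clean samples; the ridge term `lam` plays the role of the regularizer that the paper derives for weight noise and node faults.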

Keywords

  • Node fault
  • Outlier samples
  • Weight noise



Author information

Correspondence to Chi-Sing Leung.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Adegoke, M., Xiao, Y., Leung, CS., Leung, K.W. (2022). A Robust ELM Algorithm for Compensating the Effect of Node Fault and Weight Noise. In: Ghazali, R., Mohd Nawi, N., Deris, M.M., Abawajy, J.H., Arbaiy, N. (eds) Recent Advances in Soft Computing and Data Mining. SCDM 2022. Lecture Notes in Networks and Systems, vol 457. Springer, Cham. https://doi.org/10.1007/978-3-031-00828-3_7
