Learning Capabilities of ELM-Trained Time-Varying Neural Networks

Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 26)

Abstract

System identification in nonstationary environments is a challenging problem. The authors have recently proposed a neural architecture, the Time-Varying Neural Network (TV-NN), which has shown remarkable identification capabilities in such scenarios. It is characterized by time-varying weights, each expressed as a linear combination of a set of basis functions. This inevitably increases the network complexity with respect to its stationary counterpart, and to keep training time low the same authors have proposed an Extreme Learning Machine (ELM) approach for TV-NN learning in place of Back-Propagation based techniques. However, the learning capabilities of ELM-trained TV-NNs have not yet been investigated in the literature. This contribution addresses that gap: the theoretical foundations of ELM for TV-NN training are analytically discussed, extending the corresponding results obtained in the stationary case.
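To make the setting concrete, the following is a minimal sketch (not the authors' implementation) of ELM-style training for a single-hidden-layer network whose weights vary in time as linear combinations of basis functions. The polynomial basis, network sizes, and helper names (basis, elm_tv_train) are illustrative assumptions; only the overall scheme — random hidden-layer coefficients plus a single least-squares solve for the output coefficients — reflects the ELM idea described above.

```python
# Minimal sketch of ELM training for a time-varying single-hidden-layer
# network. Basis choice, sizes, and function names are assumptions made
# for illustration, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def basis(n_samples, n_basis):
    # Polynomial basis f_b(t) = t**b on normalized time t in [0, 1].
    t = np.linspace(0.0, 1.0, n_samples)
    return np.stack([t**b for b in range(n_basis)], axis=1)   # shape (N, B)

def elm_tv_train(X, Y, n_hidden=20, n_basis=3):
    N, d = X.shape
    F = basis(N, n_basis)                                      # basis values per time step
    # Random input-to-hidden coefficients: one weight matrix per basis function.
    W = rng.standard_normal((n_basis, d, n_hidden))
    b = rng.standard_normal((n_basis, n_hidden))
    # Time-varying hidden pre-activation: sum_b f_b(n) * (x_n W_b + b_b).
    Z = np.einsum('nb,nd,bdh->nh', F, X, W) + F @ b
    H = 1.0 / (1.0 + np.exp(-Z))                               # sigmoid hidden outputs
    # Expand hidden outputs with the output basis, then solve the linear
    # output coefficients in one least-squares step (the ELM principle).
    G = (H[:, :, None] * F[:, None, :]).reshape(N, -1)         # shape (N, hidden * basis)
    beta = np.linalg.pinv(G) @ Y
    return W, b, beta

# Toy usage on a nonstationary target: a sinusoid with a time-varying gain.
X = rng.standard_normal((200, 1))
Y = np.sin(3 * X) * np.linspace(0.5, 1.5, 200)[:, None]
W, b, beta = elm_tv_train(X, Y)
```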

Keywords

Extreme Learning Machine · Time-Varying Neural Networks · Nonstationary System Identification



Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Università Politecnica delle Marche, Ancona, Italy
