Reservoir-Based Evolving Spiking Neural Network for Spatio-temporal Pattern Recognition

  • Stefan Schliebs
  • Haza Nuzly Abdull Hamed
  • Nikola Kasabov
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7063)

Abstract

Evolving spiking neural networks (eSNN) are computational models that are trained in a one-pass mode from streams of data, evolving their structure and functionality from the incoming data. The paper presents an extension of eSNN, called reservoir-based eSNN (reSNN), that allows efficient processing of spatio-temporal data. By classifying the response of a recurrent spiking neural network stimulated by a spatio-temporal input signal, the eSNN acts as a readout function for a Liquid State Machine. The classification characteristics of the extended eSNN are illustrated and investigated using the LIBRAS sign language dataset. The paper provides practical guidelines for configuring the proposed model and reports competitive classification performance in the experimental results.
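The reSNN pipeline summarised above (spatio-temporal stimulus, recurrent spiking reservoir, eSNN readout of the liquid response) can be illustrated with a rough sketch. The snippet below is not the authors' implementation: it assumes a small random network of leaky integrate-and-fire neurons as the liquid, takes the reservoir's accumulated spike counts as the liquid state, and trains a one-pass, rank-order-style eSNN readout that merges similar class prototypes. All parameter values and names (liquid_state, esnn_train, SIM_THR, the toy data generator, and so on) are illustrative assumptions rather than settings from the paper.

```python
"""Minimal sketch of a reservoir-based eSNN (reSNN) pipeline, assuming:
- a random leaky integrate-and-fire (LIF) reservoir as the liquid,
- the reservoir's spike counts as the liquid state, and
- a one-pass, rank-order-style eSNN readout with prototype merging.
All parameters and helpers are illustrative, not taken from the paper."""
import numpy as np

rng = np.random.default_rng(0)

# --- Reservoir (liquid) ------------------------------------------------------
N_IN, N_RES = 8, 100               # input channels, reservoir neurons
DT, TAU, V_THR = 1.0, 20.0, 1.0    # time step, membrane time constant, threshold

W_in = rng.normal(0.0, 0.6, (N_RES, N_IN)) * (rng.random((N_RES, N_IN)) < 0.3)
W_res = rng.normal(0.0, 0.15, (N_RES, N_RES)) * (rng.random((N_RES, N_RES)) < 0.1)

def liquid_state(input_spikes):
    """Drive the LIF reservoir with a (T, N_IN) binary spike matrix and
    return the per-neuron spike counts as the liquid state."""
    v, prev, counts = np.zeros(N_RES), np.zeros(N_RES), np.zeros(N_RES)
    for t in range(input_spikes.shape[0]):
        v += DT / TAU * (-v) + W_in @ input_spikes[t] + W_res @ prev
        prev = (v >= V_THR).astype(float)
        counts += prev
        v[v >= V_THR] = 0.0          # reset neurons that fired
    return counts

# --- eSNN readout (one-pass, rank-order style) -------------------------------
MOD, SIM_THR = 0.9, 0.1              # modulation factor, merge threshold

def rank_order_weights(state):
    """Stronger liquid-state components (earlier 'firing') get larger weights."""
    order = np.argsort(np.argsort(-state))   # rank 0 = strongest component
    return MOD ** order

def esnn_train(states, labels):
    repo = {}                                # class -> list of prototype weight vectors
    for s, y in zip(states, labels):
        w = rank_order_weights(s)
        merged = False
        for proto in repo.setdefault(y, []):
            if np.linalg.norm(w - proto) < SIM_THR * np.sqrt(len(w)):
                proto += (w - proto) * 0.5   # merge similar neurons (evolving step)
                merged = True
                break
        if not merged:
            repo[y].append(w)                # otherwise add a new output neuron
    return repo

def esnn_classify(repo, state):
    w = rank_order_weights(state)
    return min(repo, key=lambda y: min(np.linalg.norm(w - p) for p in repo[y]))

# --- Toy usage ---------------------------------------------------------------
def toy_sample(label, T=50):
    """Two artificial spatio-temporal classes differing in channel activity."""
    spikes = (rng.random((T, N_IN)) < 0.05).astype(float)
    spikes[:, label] = (rng.random(T) < 0.4).astype(float)
    return spikes

train = [(toy_sample(y), y) for y in (0, 1) for _ in range(10)]
repo = esnn_train([liquid_state(x) for x, _ in train], [y for _, y in train])
test_x, test_y = toy_sample(1), 1
print("predicted:", esnn_classify(repo, liquid_state(test_x)), "true:", test_y)
```

In the paper, the liquid response is read out and passed to the eSNN classifier; how the reservoir state is sampled and encoded there may differ from this sketch, which only mirrors the overall reservoir-plus-readout structure on toy data.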

Keywords

Spiking Neural Networks · Evolving Systems · Spatio-Temporal Patterns


Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Stefan Schliebs (1)
  • Haza Nuzly Abdull Hamed (1, 2)
  • Nikola Kasabov (1, 3, 4)

  1. KEDRI, Auckland University of Technology, New Zealand
  2. Soft Computing Research Group, Universiti Teknologi Malaysia, Johor Bahru, Malaysia
  3. Institute for Neuroinformatics, ETH, Switzerland
  4. University of Zurich, Switzerland
