Multi-Neural Networks hardware and software architecture: Application of the divide to simplify paradigm DTS

  • A. Chebira
  • K. Madani
  • G. Mercier
Neural Nets Simulation, Emulation and Implementation
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1240)


We present in this paper an implementation of the data-driven method we call DTS (Divide To Simplify), which dynamically builds a multi-neural-network architecture. The proposed architecture solves a complex problem by splitting it into several easier subproblems. We have previously presented a software version of the DTS multi-neural-network architecture. The main idea of the DTS approach is to use a set of small, specialized mapping neural networks, or Slave Neural Networks (SNN), guided by a prototype-based neural network, the Master Neural Network (MNN). In this paper, the MNN manages a set of hardware digital neural networks. Learning is performed in a few milliseconds, and we obtain a very good classification rate on the two-spirals benchmark problem.
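The master/slave organization described above can be illustrated with a minimal sketch: a prototype-based "master" routes each input to the nearest prototype, and a small "slave" classifier handles only the points routed to its region. All names here (`MasterSlaveClassifier`, the every-8th-point prototype selection) and the 1-NN slave rule are our own illustrative assumptions, not the paper's implementation, which uses ZISC-036 hardware neural networks.

```python
import math

def two_spirals(n=97):
    # Classic two-spirals benchmark in the style of Lang & Witbrock:
    # two interleaved spirals, one per class, 2*n points in total.
    pts = []
    for i in range(n):
        r = 6.5 * (104 - i) / 104
        a = math.pi * i / 16.0
        x, y = r * math.sin(a), r * math.cos(a)
        pts.append(((x, y), 0))
        pts.append(((-x, -y), 1))
    return pts

def nearest(p, centers):
    # Index of the center closest to p (squared Euclidean distance).
    return min(range(len(centers)),
               key=lambda k: (p[0] - centers[k][0]) ** 2
                           + (p[1] - centers[k][1]) ** 2)

class MasterSlaveClassifier:
    # Master: a fixed set of prototypes partitioning the input space.
    # Slaves: one local classifier per prototype; each sees only the
    # training points the master routes to it (here a 1-NN rule stands
    # in for a small specialized mapping network).
    def __init__(self, prototypes):
        self.prototypes = prototypes
        self.slaves = [[] for _ in prototypes]  # per-region (point, label) lists

    def train(self, data):
        for p, label in data:
            self.slaves[nearest(p, self.prototypes)].append((p, label))

    def predict(self, p):
        region = self.slaves[nearest(p, self.prototypes)]
        if not region:
            return 0  # empty region: fall back to an arbitrary class
        pts, labels = zip(*region)
        return labels[nearest(p, pts)]

data = two_spirals()
# Use every 8th training point as a master prototype (illustrative choice).
protos = [p for (p, _) in data[::8]]
clf = MasterSlaveClassifier(protos)
clf.train(data)
correct = sum(clf.predict(p) == y for p, y in data)
print(f"training accuracy: {correct}/{len(data)}")
```

Because each slave memorizes the points routed to it, the sketch reproduces its training set exactly; the point of the decomposition is that each slave only ever faces a small, locally simple piece of the spirals.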

Key words

Divide To Simplify · Kohonen Self-Organizing Maps · Multi-Neural Network Systems · Cooperative and parallel architecture · IBM Zero Instruction Set Computer (ZISC-036)





Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • A. Chebira 1
  • K. Madani 1
  • G. Mercier 1

  1. Laboratoire d'Etudes et de Recherches en Instrumentation, Signaux et Systèmes, Division Réseaux Neuronaux, Université Paris XII, I.U.T. de Créteil-Sénart, Lieusaint, France
