
An Adaptive Network Topology for Classification

  • Qingyu Xiong
  • Jian Huang
  • Xiaodong Xian
  • Qian Xiao
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3971)

Abstract

Constructive learning algorithms have proved to be powerful methods for training feedforward neural networks. In this paper, we present an adaptive network topology with a constructive learning algorithm. It consists of an SOM network and an RBF network, serving as the basic network and the cluster network respectively. The SOM network performs unsupervised learning to place its output cells at suitable positions in the input space, and the weight vectors of those output cells are transferred to the hidden cells of the RBF network as the centers of the RBF activation functions. This establishes a one-to-one correspondence between the SOM output cells and the RBF hidden cells. The RBF network is trained with supervision using the delta rule, and its output errors determine where a new SOM cell is inserted according to an insertion rule. The RBF hidden layer therefore grows as SOM output cells are added, until a performance criterion is fulfilled or a desired network size is reached. Simulation results on the two-spirals benchmark show that the proposed adaptive network structure achieves good performance and generalization.
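The growth loop described in the abstract can be sketched in a few lines of NumPy. This is a simplified illustration, not the authors' algorithm: the paper's insertion rule uses SOM topology information that the abstract only summarizes, so the sketch below substitutes a simpler hypothetical rule (place the new center at the worst-classified training point), omits the SOM neighborhood training, and uses a fixed RBF width. The function names `rbf_activations` and `train_growing_rbf` are invented for this example.

```python
import numpy as np

def rbf_activations(X, centers, width):
    """Gaussian RBF activations: one column per hidden cell / center."""
    sq_dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dist / (2.0 * width ** 2))

def train_growing_rbf(X, y, max_centers=20, epochs=300, lr=0.5, width=0.5, seed=0):
    """Grow RBF centers one at a time, retraining the linear readout with
    the delta rule after each insertion, until all training points are
    classified correctly or the size limit is reached."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=2, replace=False)].copy()  # start small
    while True:
        H = rbf_activations(X, centers, width)
        w, b = np.zeros(len(centers)), 0.0
        for _ in range(epochs):                 # delta-rule (LMS) updates
            err = y - (H @ w + b)
            w += lr * H.T @ err / len(X)
            b += lr * err.mean()
        err = y - (H @ w + b)
        correct = ((H @ w + b) > 0.5) == (y > 0.5)
        if correct.all() or len(centers) >= max_centers:
            return centers, w, b
        # growth step (simplified rule): insert a center at the
        # worst-classified input that is not already a center
        for i in np.argsort(-np.abs(err)):
            if not any(np.allclose(X[i], c) for c in centers):
                centers = np.vstack([centers, X[i]])
                break
        else:                                   # no unused point left
            return centers, w, b

# XOR demo: not linearly separable, so the network must grow centers
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])
centers, w, b = train_growing_rbf(X, y)
preds = (rbf_activations(X, centers, width=0.5) @ w + b) > 0.5
```

Starting from two centers, the loop adds hidden cells only while classification errors remain, which mirrors the abstract's point that network size is determined by the task rather than fixed in advance.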

Keywords

Output Cell, Unsupervised Learning, Basic Network, Cluster Network, Topological Neighbor
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Qingyu Xiong (1)
  • Jian Huang (1)
  • Xiaodong Xian (1)
  • Qian Xiao (1)

  1. The Key Lab of High Voltage Engineering & Electrical New Technology of Education Ministry of China, Automation College, Chongqing University, Chongqing, China
