Comparison of Constructive Algorithms for Neural Networks
Determining the optimal architecture of a supervised neural network is still an open problem. A promising approach is represented by constructive algorithms, whose basic idea is to start from a minimal network structure and progressively add further elements. The strategies pursued during this growth can differ widely: several algorithms have been proposed, with different performances in terms of resulting architectural complexity, generalisation capability and resistance to overfitting, computational cost of extracting the architecture from the training examples, time delay for processing the input signal, and so on. In the present paper we discuss some preliminary results of a comparison among four constructive algorithms, carried out on a classical benchmark: the two-spirals classification problem (Lang and Witbrock, 1988). Because of its extreme nonlinearity, this problem is beyond the reach of a multilayer perceptron trained with the classical back-propagation (BP) learning algorithm, whereas the constructive algorithms considered here always arrive at a solution.
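To make the benchmark concrete, the following is a minimal sketch of the commonly cited two-spirals dataset generator in the spirit of Lang and Witbrock (1988); the constants (radius 6.5, 97 points per spiral over 104 steps) follow the usual recipe and are an assumption here, not taken from the paper itself:

```python
import math

def two_spirals(n_points=97):
    """Generate the two-spirals benchmark dataset.

    Returns a list of (x, y, label) tuples. The second spiral is the
    point reflection of the first through the origin, so the two classes
    are maximally intertwined -- the property that makes the problem so
    hard for a plain multilayer perceptron.
    NOTE: the constants below are the commonly cited ones, assumed here.
    """
    data = []
    for i in range(n_points):
        angle = i * math.pi / 16.0              # angular step
        radius = 6.5 * (104 - i) / 104.0        # radius shrinks toward the centre
        x = radius * math.sin(angle)
        y = radius * math.cos(angle)
        data.append((x, y, 1))                  # spiral of class 1
        data.append((-x, -y, 0))                # mirrored spiral of class 0
    return data

points = two_spirals()                          # 97 + 97 = 194 labelled points
```

Each class-0 point is the negation of its class-1 counterpart, which is what defeats linear and weakly nonlinear decision boundaries.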
- Fahlman S.E., Lebiere C. (1990). The cascade-correlation learning architecture. Advances in Neural Information Processing Systems, 2, D. Touretzky (ed.), 524–532, Morgan Kaufmann.
- Frattale Mascioli F.M., Martinelli G. (1993). A constructive algorithm for binary mapping. Proc. of ICANN '93, 776.
- Lang K.J., Witbrock M.J. (1988). Learning to tell two spirals apart. Proc. Connectionist Models Summer School, Morgan Kaufmann.
- Martinelli G., Frattale Mascioli F.M., Bei G. (1993). Cascade neural network for binary mapping. IEEE Trans. on Neural Networks, 4(1), 148–150.