Weight freezing in constructive neural networks: A novel approach
Constructive algorithms can be classified into two main groups, freezing and non-freezing, each with its own advantages and drawbacks. In large-scale problems, freezing algorithms are preferable thanks to their speed. The main limitation of these algorithms, however, comes from the fixed-size nature of the new units they add. In this paper, we present a new freezing algorithm that constructs the main network by adding small, variable-size accessory networks, trained by a non-freezing algorithm, instead of single units...
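The freezing idea described above can be illustrated with a minimal sketch: each small accessory network is trained on the residual error of the network built so far, then its weights are frozen and the next accessory network is added. This is only a schematic illustration under simplifying assumptions (a toy 1-D regression task, plain gradient descent, a hypothetical size schedule), not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_subnet(X, residual, hidden, epochs=2000, lr=0.05):
    """Train one small accessory network (one tanh hidden layer) on the
    current residual. Previously added subnetworks stay frozen; only
    this subnetwork's weights are updated."""
    n_in = X.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1))
    b2 = np.zeros(1)
    n = len(X)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)          # hidden activations
        out = H @ W2 + b2                 # subnetwork output
        err = out - residual              # gradient of 0.5 * MSE w.r.t. out
        gW2 = H.T @ err / n
        gb2 = err.mean(0)
        dH = (err @ W2.T) * (1 - H**2)    # backprop through tanh
        gW1 = X.T @ dH / n
        gb1 = dH.mean(0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    # Returning a closure "freezes" the trained weights.
    return lambda X: np.tanh(X @ W1 + b1) @ W2 + b2

# Toy 1-D regression target
X = np.linspace(-2, 2, 200).reshape(-1, 1)
y = np.sin(2 * X)

subnets = []                      # the frozen accessory networks
pred = np.zeros_like(y)
for size in (2, 3, 4):            # hypothetical variable-size schedule
    f = train_subnet(X, y - pred, hidden=size)
    subnets.append(f)             # weights are now frozen
    pred = sum(g(X) for g in subnets)
    mse = float(np.mean((y - pred) ** 2))
    print(f"subnets={len(subnets)}  mse={mse:.4f}")
```

Because each accessory network only has to model the residual of the frozen ensemble, its training problem stays small, which is the speed advantage of freezing schemes that the abstract refers to.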