Weight freezing in constructive neural networks: A novel approach

  • Shahram Hosseini
  • Christian Jutten
Artificial Neural Nets Simulation and Implementation
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1607)


Constructive algorithms can be classified into two main groups, freezing and non-freezing, each with its own advantages and drawbacks. For large-scale problems, freezing algorithms are preferable because of their speed. Their main weakness, however, stems from the fixed size of the new units they add. In this paper, we present a new freezing algorithm that builds the main network by adding small, variable-size accessory networks, trained with a non-freezing algorithm, instead of single units...
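To make the idea concrete, below is a minimal sketch (not the authors' implementation) of such a freezing constructive scheme: the main network is the sum of already-trained, frozen accessory networks, and each new accessory network, whose hidden-layer size may vary from step to step, is trained on the current residual error with ordinary gradient descent. All names, network sizes, the growth schedule and the toy problem are illustrative assumptions.

```python
# Hypothetical sketch of a freezing constructive scheme: the main network is a
# sum of small frozen accessory networks, each trained on the residual error
# left by the previously frozen ones. Sizes and training details are assumed.
import numpy as np

rng = np.random.default_rng(0)

def make_accessory(n_in, n_hidden):
    """One-hidden-layer tanh accessory network with random initial weights."""
    return {
        "W1": rng.normal(scale=0.5, size=(n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(scale=0.5, size=(n_hidden, 1)),
        "b2": np.zeros(1),
    }

def forward(net, X):
    H = np.tanh(X @ net["W1"] + net["b1"])
    return H @ net["W2"] + net["b2"], H

def train_accessory(net, X, residual, epochs=2000, lr=0.01):
    """Non-freezing training of a single accessory network on the residual
    (plain batch gradient descent on the squared error)."""
    for _ in range(epochs):
        out, H = forward(net, X)
        err = out - residual                       # (N, 1)
        grad_W2 = H.T @ err / len(X)
        grad_b2 = err.mean(axis=0)
        dH = (err @ net["W2"].T) * (1.0 - H**2)    # backprop through tanh layer
        grad_W1 = X.T @ dH / len(X)
        grad_b1 = dH.mean(axis=0)
        net["W2"] -= lr * grad_W2
        net["b2"] -= lr * grad_b2
        net["W1"] -= lr * grad_W1
        net["b1"] -= lr * grad_b1
    return net

def main_network_output(frozen_nets, X):
    """The main network is the sum of all frozen accessory networks."""
    y = np.zeros((len(X), 1))
    for net in frozen_nets:
        y += forward(net, X)[0]
    return y

# Toy regression problem: y = sin(3x) on [-1, 1].
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(3 * X)

frozen_nets = []                      # already-trained parts, never retrained
for n_hidden in (2, 3, 4):            # variable-size accessory nets (assumed schedule)
    residual = y - main_network_output(frozen_nets, X)
    acc = train_accessory(make_accessory(1, n_hidden), X, residual)
    frozen_nets.append(acc)           # freeze the new accessory network
    mse = np.mean((y - main_network_output(frozen_nets, X))**2)
    print(f"after adding {n_hidden}-unit accessory net: MSE = {mse:.4f}")
```

Because previously added accessory networks stay frozen, each constructive step optimizes only the small new network, which is what gives freezing algorithms their speed advantage on large-scale problems.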




Copyright information

© Springer-Verlag Berlin Heidelberg 1999

Authors and Affiliations

  • Shahram Hosseini
  • Christian Jutten
  1. LIS, INPG, Grenoble Cedex, France
