An Interval Approach for Weight’s Initialization of Feedforward Neural Networks

  • Marcela Jamett
  • Gonzalo Acuña
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4293)


This work addresses an important problem in feedforward neural network (FNN) training: finding the pseudo-global minimum of the cost function while assuring good generalization properties of the trained architecture. First, pseudo-global optimization is achieved by a combined parameter-updating algorithm that relies on transforming the network parameters into interval numbers. This solves the weight-initialization problem by performing an exhaustive search for minima by means of interval arithmetic (IA). The global minimum is then obtained once the search has been restricted to the region of convergence (ROC). Because IA represents variables and parameters as compact, closed sets, training can be carried out directly on interval weights. The methodology is illustrated in the last section by approximating a known non-linear function.
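The abstract's central idea, representing weights as intervals and exhaustively discarding regions of weight space that provably cannot contain the cost minimum, can be illustrated with a minimal branch-and-bound sketch. This is not the paper's algorithm; it assumes a single-weight neuron y = tanh(w·x) and a plain MSE cost, and all names (`Interval`, `cost_bounds`, `interval_init`) are illustrative:

```python
import math

class Interval:
    """Closed interval [lo, hi] with just the arithmetic needed below."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def scale(self, c):
        # Multiply by a real scalar (c may be negative, so reorder).
        a, b = c * self.lo, c * self.hi
        return Interval(min(a, b), max(a, b))
    def tanh(self):
        # tanh is monotone increasing, so endpoint images suffice.
        return Interval(math.tanh(self.lo), math.tanh(self.hi))
    def sq(self):
        # Tight enclosure of x**2 over the interval.
        a, b = self.lo ** 2, self.hi ** 2
        lo = 0.0 if self.lo <= 0.0 <= self.hi else min(a, b)
        return Interval(lo, max(a, b))
    @property
    def mid(self):
        return 0.5 * (self.lo + self.hi)
    @property
    def width(self):
        return self.hi - self.lo

def cost_bounds(w, samples):
    """Interval enclosure of the MSE of y = tanh(w*x) over a weight interval w."""
    total = Interval(0.0, 0.0)
    for x, t in samples:
        err = w.scale(x).tanh() + Interval(-t, -t)
        total = total + err.sq()
    return total.scale(1.0 / len(samples))

def interval_init(samples, w_box=Interval(-5.0, 5.0), tol=1e-3):
    """Branch-and-bound over the weight box: bisect, discard boxes whose
    lower cost bound exceeds the best upper bound seen so far, and return
    the midpoint of the best surviving box as the initial weight."""
    boxes = [w_box]
    best_ub = float("inf")
    best = w_box
    while boxes:
        w = boxes.pop()
        c = cost_bounds(w, samples)
        if c.lo > best_ub:        # cannot contain the minimum: prune
            continue
        if c.hi < best_ub:        # new best guaranteed cost
            best_ub, best = c.hi, w
        if w.width > tol:         # still too wide: bisect and keep searching
            m = w.mid
            boxes += [Interval(w.lo, m), Interval(m, w.hi)]
    return best.mid

# Targets generated with w = 1.5, so the search should land near it.
samples = [(x / 4.0, math.tanh(1.5 * x / 4.0)) for x in range(-8, 9)]
w0 = interval_init(samples)
```

The pruning test is what makes the search exhaustive yet tractable: because `cost_bounds` is a guaranteed enclosure, a discarded box provably contains no weight better than the current best, which mirrors the role interval arithmetic plays in the weight-initialization scheme described above.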


Keywords: Root Mean Square, Gradient Descent, Feedforward Neural Network, Interval Arithmetic, Interval Approach




Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Marcela Jamett (1)
  • Gonzalo Acuña (2)
  1. Departamento de Diseño, Universidad Tecnológica Metropolitana (UTEM), Santiago, Chile
  2. Departamento de Ingeniería Informática, Universidad de Santiago de Chile (USACH), Santiago, Chile
