Neural Processing Letters, Volume 2, Issue 6, pp 1–4

Use of some sensitivity criteria for choosing networks with good generalization ability

  • Yannis Dimopoulos
  • Paul Bourret
  • Sovan Lek


In most applications of the multilayer perceptron (MLP), the main objective is to maximize the generalization ability of the network. We show that this ability is related to the sensitivity of the MLP's output to small input changes. Several criteria have been proposed for evaluating this sensitivity. We propose a new index and present a way of improving these sensitivity criteria. Numerical experiments allow a first comparison of how effective these criteria are.
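The abstract does not spell out the sensitivity criteria themselves, but the underlying idea can be illustrated with a toy sketch: measure how strongly a network's output responds to small input perturbations. Below is a minimal example (not the authors' index) that estimates one plausible criterion, the mean squared partial derivative of the output with respect to each input, by central finite differences on a small MLP with random stand-in weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer MLP: 3 inputs -> 5 hidden units (tanh) -> 1 output.
# The weights are random stand-ins for a trained model.
W1 = rng.normal(size=(5, 3))
b1 = rng.normal(size=5)
W2 = rng.normal(size=(1, 5))
b2 = rng.normal(size=1)

def mlp(x):
    """Forward pass of the toy MLP."""
    return W2 @ np.tanh(W1 @ x + b1) + b2

def sensitivity(x, eps=1e-5):
    """Mean squared partial derivative of the output w.r.t. each input,
    estimated by central finite differences (one hypothetical criterion,
    not the index proposed in the paper)."""
    grads = []
    for i in range(len(x)):
        d = np.zeros_like(x)
        d[i] = eps
        grads.append((mlp(x + d) - mlp(x - d))[0] / (2 * eps))
    return float(np.mean(np.square(grads)))

# Average the index over a sample of inputs, as one would over a data set;
# a lower average would indicate a smoother, less input-sensitive mapping.
xs = rng.normal(size=(100, 3))
S = np.mean([sensitivity(x) for x in xs])
```

A network whose output surface is smoother over the input region yields a smaller average index `S`, which is the intuition linking such criteria to generalization ability.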







Copyright information

© Kluwer Academic Publishers 1995

Authors and Affiliations

  • Yannis Dimopoulos (1)
  • Paul Bourret (2)
  • Sovan Lek (1)

  1. UMR 9964, Equipe de Biologie Quantitative, Université Paul Sabatier, Toulouse Cedex, France
  2. Département d'Etudes et de Recherches en Informatique, ONERA-CERT, Toulouse Cedex, France
