
Use of some sensitivity criteria for choosing networks with good generalization ability

Neural Processing Letters

Abstract

In most applications of the multilayer perceptron (MLP), the main objective is to maximize the generalization ability of the network. We show that this ability is related to the sensitivity of the MLP's output to small input changes. Several criteria have been proposed for evaluating this sensitivity. We propose a new index and present a way to improve these sensitivity criteria. Numerical experiments provide a first comparison of the efficiency of these criteria.
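The abstract does not give the criteria themselves, but as a rough illustration of the kind of quantity involved, the sketch below computes a simple input-sensitivity measure for a one-hidden-layer MLP: the mean absolute derivative of the output with respect to each input, averaged over a sample of input patterns. The network size, the tanh activation, and the averaging scheme are illustrative assumptions, not the index or the improvement proposed by the authors.

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer MLP: tanh hidden units, single linear output."""
    h = np.tanh(W1 @ x + b1)           # hidden activations, shape (n_hidden,)
    y = W2 @ h + b2                    # scalar output, shape (1,)
    return y, h

def input_sensitivity(X, W1, b1, W2, b2):
    """Mean |dy/dx_i| over the sample X (one row per input pattern).

    For tanh hidden units, dy/dx = W2 @ diag(1 - h**2) @ W1.
    This is one simple sensitivity measure, not the authors' criterion.
    """
    grads = []
    for x in X:
        _, h = mlp_forward(x, W1, b1, W2, b2)
        dy_dx = ((W2 * (1.0 - h ** 2)) @ W1).ravel()   # shape (n_inputs,)
        grads.append(np.abs(dy_dx))
    return np.mean(grads, axis=0)      # one sensitivity value per input

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_in, n_hidden = 3, 5
    # Randomly initialized weights stand in for a trained network.
    W1 = rng.normal(size=(n_hidden, n_in))
    b1 = rng.normal(size=n_hidden)
    W2 = rng.normal(size=(1, n_hidden))
    b2 = rng.normal(size=1)

    X = rng.normal(size=(200, n_in))   # sample of input patterns
    print("per-input sensitivity:", input_sensitivity(X, W1, b1, W2, b2))
```

A smaller overall sensitivity indicates a smoother mapping from inputs to output, which is the property the abstract relates to generalization ability.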




Cite this article

Dimopoulos, Y., Bourret, P. & Lek, S. Use of some sensitivity criteria for choosing networks with good generalization ability. Neural Process Lett 2, 1–4 (1995). https://doi.org/10.1007/BF02309007
