Balancing bias and variance: Network topology and pattern set reduction techniques

From Natural to Artificial Neural Computation, Lecture Notes in Computer Science, Volume 930, pp. 551-558

  • T. D. Gedeon (School of Computer Science and Engineering, The University of New South Wales; Department of Telecommunications and Telematics, The Technical University of Budapest)
  • P. M. Wong (Centre for Petroleum Engineering, The University of New South Wales)
  • D. Harris (Centre for Neural Networks, King's College)

Abstract

It has been estimated that some 70% of applications of neural networks use some variant of the multi-layer feed-forward network trained using back-propagation. These neural networks are non-parametric estimators, and their limitations can be explained by a well-understood problem in non-parametric statistics: the “bias and variance” dilemma. The dilemma is that, to obtain a good approximation of an input-output relationship using some form of estimator, either constraints must be placed on the structure of the estimator, thereby introducing bias, or a very large number of examples of the relationship must be used to construct the estimator. Thus, we have a trade-off between generalisation ability and training time.
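Though not spelled out in the abstract, the dilemma refers to the standard decomposition of an estimator's expected squared error, stated here in LaTeX notation (f is the target function, \hat{f} the trained estimator, and \sigma^2 the variance of the observation noise):

    \mathbb{E}\left[ (y - \hat{f}(x))^2 \right]
        = \sigma^2
        + \left( \mathbb{E}[\hat{f}(x)] - f(x) \right)^2
        + \mathbb{E}\left[ \left( \hat{f}(x) - \mathbb{E}[\hat{f}(x)] \right)^2 \right]

Constraining the network topology shrinks the variance term at the price of added bias, while an unconstrained network keeps bias low but needs a very large pattern set to control variance.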

We overview this area and introduce our own methods for reducing the size of trained networks without compromising their generalisation abilities, and for reducing the size of the training pattern set to improve training time, again without reducing generalisation.
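The abstract names the two kinds of reduction but not the specific techniques, so the following Python sketch is illustrative only: magnitude-based weight pruning and a greedy distance-based pattern filter are generic stand-ins for topology and pattern set reduction, not the authors' methods.

    import numpy as np

    def prune_by_magnitude(weights, keep_fraction=0.5):
        # Zero the smallest-magnitude weights of a trained layer:
        # a generic stand-in for topology reduction.
        flat = np.abs(weights).ravel()
        n_drop = int(flat.size * (1.0 - keep_fraction))
        pruned = weights.copy()
        if n_drop > 0:
            threshold = np.partition(flat, n_drop - 1)[n_drop - 1]
            # Ties at the threshold may drop a few extra weights.
            pruned[np.abs(pruned) <= threshold] = 0.0
        return pruned

    def reduce_pattern_set(patterns, min_distance=0.1):
        # Greedily keep a pattern only if it lies at least min_distance
        # from every pattern already kept, thinning redundant examples
        # while preserving coverage of the input space.
        kept = []
        for p in patterns:
            if all(np.linalg.norm(p - q) >= min_distance for q in kept):
                kept.append(p)
        return np.asarray(kept)

For example, prune_by_magnitude(hidden_weights, keep_fraction=0.5) halves the number of active connections in a layer, after which the network would normally be retrained briefly to recover any lost accuracy.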