Training Data Selection for Optimal Generalization with Noise Variance Reduction in Neural Networks

  • Conference paper
Neural Nets WIRN VIETRI-98

Part of the book series: Perspectives in Neural Computing

Abstract

In this paper, we discuss the problem of active training data selection in the presence of noise. We formalize learning in neural networks as an inverse problem in a functional analytic framework and adopt the Averaged Projection criterion as the optimization criterion for learning. Within this framework, we approach training data selection with two objectives: improving generalization ability and reducing the noise variance in order to achieve better learning results. The final result uses a priori correlation information on the noise characteristics and the original function ensemble to devise an efficient sampling scheme, which can be used in conjunction with the incremental learning schemes devised in our earlier work to achieve optimal generalization.
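The paper's own sampling scheme (based on the Averaged Projection criterion and known noise correlations) is not reproduced here, but the general idea of choosing training inputs to suppress noise variance can be illustrated with a generic greedy A-optimal design sketch for a Bayesian linear model. Everything in this snippet (the feature map `phi`, the `noise_var` and `prior_prec` parameters) is a hypothetical stand-in, not the authors' method:

```python
import numpy as np

def phi(x):
    # Hypothetical feature map: quadratic polynomial features of a 1-D input.
    return np.array([1.0, x, x ** 2])

def select_points(candidates, n_select, noise_var=0.1, prior_prec=1e-3):
    """Greedy A-optimal selection: at each step, pick the candidate input
    that minimizes the trace of the posterior covariance of a Bayesian
    linear model. A smaller trace means lower parameter (and hence
    predictive) variance under the assumed observation-noise level."""
    d = phi(candidates[0]).size
    A = prior_prec * np.eye(d)          # posterior precision accumulated so far
    chosen = []
    for _ in range(n_select):
        best, best_trace = None, np.inf
        for x in candidates:
            if x in chosen:
                continue
            f = phi(x)
            # Rank-one precision update if x were sampled, weighted by 1/noise_var.
            A_new = A + np.outer(f, f) / noise_var
            t = np.trace(np.linalg.inv(A_new))
            if t < best_trace:
                best, best_trace = x, t
        chosen.append(best)
        f = phi(best)
        A += np.outer(f, f) / noise_var  # commit the selected point
    return chosen

# Example: pick 5 informative inputs from a uniform candidate grid.
picked = select_points(list(np.linspace(-1.0, 1.0, 21)), 5)
```

Under this criterion the selected inputs tend to spread toward the regions where the features are most informative, rather than clustering; the paper's scheme refines this kind of idea by additionally exploiting correlation structure in the noise.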





Copyright information

© 1999 Springer-Verlag London Limited

About this paper

Cite this paper

Vijayakumar, S., Sugiyama, M., Ogawa, H. (1999). Training Data Selection for Optimal Generalization with Noise Variance Reduction in Neural Networks. In: Marinaro, M., Tagliaferri, R. (eds) Neural Nets WIRN VIETRI-98. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-0811-5_14

  • DOI: https://doi.org/10.1007/978-1-4471-0811-5_14

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-1208-2

  • Online ISBN: 978-1-4471-0811-5

  • eBook Packages: Springer Book Archive
