
Data Representation And Generalisation In An Application Of a Feed-Forward Neural Net

  • Conference paper
Neural Network Applications

Part of the book series: Perspectives in Neural Computing ((PERSPECT.NEURAL))


Abstract

This paper examines the role of a feed-forward neural network in an application involving the segmentation of medical images. Particular emphasis is placed on the choice of input to the network, in terms of both its representation and its content. It is shown that pre-processing of the input information by a statistical classifier leads to a significant improvement in the network's performance. An examination is also made of the network's ability to generalise. In particular, the importance of using validation data during training is demonstrated.
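The two ideas the abstract highlights — feeding a network statistical-classifier outputs rather than raw features, and using a held-out validation set to decide when to stop training — can be sketched as follows. This is an illustrative sketch only: the Gaussian classifier, the toy two-feature data, the network sizes, and the learning rate are assumptions for the example, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class "pixel feature" data standing in for image features;
# the class means below are illustrative assumptions, not paper values.
X0 = rng.normal(loc=-1.0, scale=1.0, size=(200, 2))
X1 = rng.normal(loc=+1.0, scale=1.0, size=(200, 2))
X = np.vstack([X0, X1])
y = np.concatenate([np.zeros(200), np.ones(200)])

def gaussian_posteriors(X, means, var=1.0):
    """Statistical pre-processing: map raw features to class posterior
    probabilities under equal-prior Gaussian class models."""
    logp = np.stack([-np.sum((X - m) ** 2, axis=1) / (2 * var) for m in means], axis=1)
    logp -= logp.max(axis=1, keepdims=True)   # numerical stability
    p = np.exp(logp)
    return p / p.sum(axis=1, keepdims=True)

means = np.array([[-1.0, -1.0], [1.0, 1.0]])
P = gaussian_posteriors(X, means)    # network input: posteriors, not raw features

# Split into training and validation sets; the validation set is used
# only to monitor generalisation (early stopping), never for weight updates.
idx = rng.permutation(len(X))
tr, va = idx[:300], idx[300:]

# One-hidden-layer feed-forward net trained by gradient descent (backprop).
H = 4
W1 = rng.normal(scale=0.5, size=(2, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)

def forward(P):
    h = np.tanh(P @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return h, out.ravel()

best_err, best_params, patience, lr = np.inf, None, 0, 0.5
for epoch in range(500):
    h, out = forward(P[tr])
    err = out - y[tr]                 # d(cross-entropy)/d(logit) for sigmoid output
    gW2 = h.T @ err[:, None] / len(tr); gb2 = err.mean(keepdims=True)
    dh = (err[:, None] @ W2.T) * (1 - h ** 2)
    gW1 = P[tr].T @ dh / len(tr); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

    # Early stopping: keep the weights with the lowest validation error.
    _, out_va = forward(P[va])
    val_err = np.mean((out_va > 0.5) != y[va])
    if val_err < best_err:
        best_err, patience = val_err, 0
        best_params = (W1.copy(), b1.copy(), W2.copy(), b2.copy())
    else:
        patience += 1
        if patience > 20:
            break

W1, b1, W2, b2 = best_params
_, out_va = forward(P[va])
accuracy = np.mean((out_va > 0.5) == y[va])
```

Restoring the best-on-validation weights, rather than the final ones, is what guards against over-fitting the training set once validation error starts to rise.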




Copyright information

© 1992 Springer-Verlag London Limited

About this paper

Cite this paper

Toulson, D.L., Boyce, J.F., Hinton, C. (1992). Data Representation And Generalisation In An Application Of a Feed-Forward Neural Net. In: Taylor, J.G. (eds) Neural Network Applications. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-2003-2_4


  • DOI: https://doi.org/10.1007/978-1-4471-2003-2_4

  • Publisher Name: Springer, London

  • Print ISBN: 978-3-540-19772-0

  • Online ISBN: 978-1-4471-2003-2

  • eBook Packages: Springer Book Archive
