Neural Computing & Applications, Volume 3, Issue 1, pp 38–49

Comparison of kernel estimators, perceptrons and radial-basis functions for OCR and speech classification


DOI: 10.1007/BF01414175

Cite this article as:
Alpaydin, E. & Gürgen, F. Neural Comput & Applic (1995) 3: 38. doi:10.1007/BF01414175


Abstract

We compare kernel estimators, single- and multi-layer perceptrons, and radial-basis functions on two classification problems: handwritten digits and speech phonemes. By taking two different applications and employing many techniques, we report a two-dimensional study that allows a domain-independent assessment of these learning methods. We consider a feed-forward network with one hidden layer. As examples of local methods, we use kernel estimators such as k-nearest neighbour (k-nn), Parzen windows, generalised k-nn, and Grow and Learn (Condensed Nearest Neighbour); we also consider fuzzy k-nn due to its similarity. As distributed networks, we use the linear perceptron, the pairwise separating linear perceptron, and multi-layer perceptrons with sigmoidal hidden units. We also test the radial-basis function network, which combines local and distributed approaches. Four criteria are used for comparison: correct classification of the test set, network size, learning time, and operational complexity. We find that perceptrons, when the architecture is suitable, generalise better than local, memory-based kernel estimators, but require longer training and more precise computation. Local networks are simple, learn quickly and acceptably, but use more memory.
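To make the contrast concrete, the k-nn rule mentioned above can be sketched in a few lines: training is just storing the examples (hence "memory-based" and fast to learn), while classification requires a distance computation against the whole stored set (hence the higher memory and operational cost). This is a minimal illustrative sketch with hypothetical toy data, not code or data from the paper:

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Majority vote among the k nearest stored examples
    (Euclidean distance) -- the basic k-nn kernel estimator."""
    neighbours = sorted(train, key=lambda ex: math.dist(ex[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical 2-D data: two classes, "a" near the origin, "b" near (1, 1).
train = [((0.0, 0.0), "a"), ((0.2, 0.1), "a"), ((0.1, 0.3), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b"), ((1.2, 0.8), "b")]

print(knn_classify(train, (0.1, 0.1)))  # a point near class "a"
print(knn_classify(train, (1.0, 0.9)))  # a point near class "b"
```

Note that "training" here costs nothing beyond storage, whereas each query scans all stored examples; a perceptron inverts this trade-off, paying at training time for a compact model that is cheap to evaluate.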


Keywords: Kernel estimators · Perceptrons · Back-propagation · Radial-basis functions · Optical character recognition · Speech recognition

Copyright information

© Springer-Verlag London Limited 1995

Authors and Affiliations

  1. Department of Computer Engineering, Boğaziçi University, Istanbul, Turkey