Fast Classification with Neural Networks via Confidence Rating
We present a novel technique to reduce the computational burden associated with the operational phase of neural networks. To this end, we develop a very simple procedure for fast classification that can be applied to any network whose output is calculated as a weighted sum of terms, a condition that covers a wide variety of neural schemes, such as multi-net networks and Radial Basis Function (RBF) networks, among many others. The idea consists of sequentially evaluating the sum terms and comparing the partial output against a series of thresholds, each associated with the confidence that the partial output will coincide with the overall network classification decision. The possibilities of this strategy are illustrated by experiments on a benchmark of binary classification problems, using Real AdaBoost and RBF networks as the underlying technologies.
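The sequential-evaluation idea described in the abstract can be sketched as follows. This is a minimal illustration under assumptions: the term values, threshold schedule, and sign-based decision rule are hypothetical choices for a binary classifier, not the paper's exact procedure.

```python
def fast_classify(term_outputs, thresholds):
    """Accumulate weighted sum terms one at a time; stop early once the
    partial sum clears the confidence threshold for that stage.

    term_outputs: per-term contributions to the network output (weights
                  already applied), e.g. one value per RBF unit or
                  per boosting round.
    thresholds:   one confidence threshold per stage; the last entry
                  should be 0 so a decision is always reached.
    Returns (predicted_label, number_of_terms_evaluated).
    """
    partial = 0.0
    for stage, (term, thr) in enumerate(zip(term_outputs, thresholds)):
        partial += term
        # If the partial output is already confidently positive or
        # negative, classify now and skip the remaining terms.
        if abs(partial) >= thr:
            return (1 if partial >= 0 else -1), stage + 1
    # Fallback: full evaluation, standard sign decision.
    return (1 if partial >= 0 else -1), len(term_outputs)
```

For example, with contributions `[0.9, 0.3, -0.1]` and thresholds `[0.5, 1.0, 0.0]`, the first partial sum (0.9) already exceeds 0.5, so the classifier outputs +1 after evaluating a single term. The computational saving comes from the "easy" patterns that are decided early, while ambiguous patterns fall through to the full sum.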