Abstract
Standard multiclass pattern recognition requires frequent re-learning stages when the set of categories of interest evolves over time. To minimize the computational cost of class incorporation and removal, we divide the global multiclass recognizer into a collection of class-pairwise neural dichotomizers. When a new class appears, an adequate set of dichotomizers is created and trained to discriminate the new class from the rest. When a class disappears, its associated dichotomizers are eliminated. In both cases, previously learned knowledge is not disturbed. The properties of neural recognizers and pairwise modularization admit an analytic, quasi-optimum method for combining network outputs into the global multiclass response. An incremental and distributed pattern recognition architecture is presented and its performance experimentally evaluated, obtaining better error rates and learning times than conventional multiclass recognizers using similar resources. The design is highly parallel and asynchronous, making it well suited to dynamic real-time applications.
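The incremental scheme described above can be sketched in a few lines. This is a minimal illustration only: the paper's multilayer-perceptron dichotomizers are replaced here by toy centroid-based linear units with sigmoid outputs, and the "quasi-optimum" analytic combination is approximated by simply accumulating pairwise posterior estimates; class names, sample data, and all method names are invented for the example.

```python
import math

class PairwiseRecognizer:
    """Sketch of an incremental one-vs-one recognizer.

    Each pair of classes (i, j), i < j, owns its own dichotomizer.
    A toy centroid-based linear unit with a sigmoid output stands in
    for the paper's multilayer perceptrons (illustration only).
    """

    def __init__(self):
        self.data = {}           # class label -> list of feature vectors
        self.dichotomizers = {}  # (i, j) with i < j -> (weights, bias)

    def _train_pair(self, i, j):
        # Toy linear dichotomizer: hyperplane between class centroids,
        # oriented so that w.x + b > 0 on class i's side.
        ci = [sum(v) / len(self.data[i]) for v in zip(*self.data[i])]
        cj = [sum(v) / len(self.data[j]) for v in zip(*self.data[j])]
        w = [a - b for a, b in zip(ci, cj)]
        mid = [(a + b) / 2 for a, b in zip(ci, cj)]
        bias = -sum(wk * mk for wk, mk in zip(w, mid))
        self.dichotomizers[(i, j)] = (w, bias)

    def add_class(self, label, samples):
        # Only the new class's pairwise dichotomizers are created and
        # trained; previously learned pairs are left untouched.
        self.data[label] = list(samples)
        for other in self.data:
            if other != label:
                self._train_pair(*sorted((label, other)))

    def remove_class(self, label):
        # Removal simply deletes the class's dichotomizers.
        del self.data[label]
        self.dichotomizers = {p: d for p, d in self.dichotomizers.items()
                              if label not in p}

    def predict(self, x):
        # Combine pairwise sigmoid outputs by accumulating evidence
        # per class (a stand-in for the analytic combination rule).
        score = {c: 0.0 for c in self.data}
        for (i, j), (w, bias) in self.dichotomizers.items():
            net = sum(wk * xk for wk, xk in zip(w, x)) + bias
            p = 1.0 / (1.0 + math.exp(-net))   # estimate of P(i | x, {i, j})
            score[i] += p
            score[j] += 1.0 - p
        return max(score, key=score.get)

# Illustrative usage with made-up 2-D data:
r = PairwiseRecognizer()
r.add_class("a", [(0, 0), (1, 0), (0, 1)])
r.add_class("b", [(9, 9), (10, 9), (9, 10)])
r.add_class("c", [(0, 9), (1, 9), (0, 10)])
```

Note that `add_class` trains only the dichotomizers involving the new class, and `remove_class` touches nothing but the eliminated pairs, mirroring the abstract's claim that previously learned knowledge is preserved.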
Copyright information
© 1995 Springer-Verlag Berlin Heidelberg
Cite this paper
García, A.R., Túnez, F.J.A. (1995). Quasi-optimum combination of multilayer perceptrons for adaptive multiclass pattern recognition. In: Mira, J., Sandoval, F. (eds) From Natural to Artificial Neural Computation. IWANN 1995. Lecture Notes in Computer Science, vol 930. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-59497-3_266
DOI: https://doi.org/10.1007/3-540-59497-3_266
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-59497-0
Online ISBN: 978-3-540-49288-7