Abstract
It is well known that, for two normally distributed classes, the optimal discriminant function is linear when the covariance matrices are equal. In this paper, we determine conditions under which the optimal classifier remains linear even when the covariance matrices are unequal. In all the cases discussed here, the classifier is given by a pair of straight lines, a particular case of the general second-degree equation. One of these cases arises when two overlapping classes have equal means, a general case of Minsky's Paradox for the Perceptron. Our results, which to our knowledge are the first for pairwise linear classifiers, yield a general linear classifier for this case that can be obtained directly from the parameters of the distributions. Numerous other analytic results for two- and d-dimensional normal vectors have been derived. Finally, we provide empirical results for all the cases and demonstrate that these linear classifiers achieve very good performance.
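The equal-means case can be illustrated with standard Bayes decision theory (a minimal sketch, not the paper's own derivation; the covariance matrices below are illustrative): for two zero-mean Gaussians with covariances diag(1, 4) and diag(4, 1), the determinants coincide, the log-likelihood ratio reduces to a difference of quadratic forms, and the optimal boundary degenerates into the pair of straight lines y = ±x — the XOR-like configuration underlying Minsky's Paradox.

```python
import numpy as np

def log_density(x, S):
    """Log of the N(0, S) density at x, dropping the shared (2*pi)^(-d/2) factor."""
    return -0.5 * (x @ np.linalg.solve(S, x) + np.log(np.linalg.det(S)))

S1 = np.diag([1.0, 4.0])   # class 1: elongated along the y-axis
S2 = np.diag([4.0, 1.0])   # class 2: elongated along the x-axis

# With equal priors, the Bayes rule compares log densities. Since
# |S1| = |S2|, g(x) reduces to (3/8)*(y^2 - x^2), so the decision
# boundary g(x) = 0 is exactly the line pair y = x and y = -x.
def g(x):
    return log_density(x, S1) - log_density(x, S2)

print(abs(g(np.array([3.0, 3.0]))) < 1e-12)  # on y = x: boundary → True
print(g(np.array([0.0, 2.0])) > 0)           # y-axis point: class 1 side → True
print(g(np.array([2.0, 0.0])) < 0)           # x-axis point: class 2 side → True
```

Each quadrant is split between the two classes, so no single hyperplane separates them, yet the optimal boundary is still a union of two straight lines obtainable directly from the distribution parameters.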
Keywords
- Discriminant Function
- Covariance Matrix
- Positive Real Number
- Back Propagation Neural Network
- Linear Discriminant Function
Copyright information
© 2000 Springer-Verlag Berlin Heidelberg
Cite this paper
Rueda, L., Oommen, B.J. (2000). The Foundational Theory of Optimal Bayesian Pairwise Linear Classifiers. In: Ferri, F.J., Iñesta, J.M., Amin, A., Pudil, P. (eds) Advances in Pattern Recognition. SSPR/SPR 2000. Lecture Notes in Computer Science, vol 1876. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44522-6_60
DOI: https://doi.org/10.1007/3-540-44522-6_60
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-67946-2
Online ISBN: 978-3-540-44522-7