New Bounds and Approximations for the Error of Linear Classifiers
In this paper, we derive lower and upper bounds on the probability of error of a linear classifier when the random vectors representing the underlying classes follow multivariate normal distributions. The error expression is derived in a one-dimensional space, independently of the dimensionality of the original problem. Based on the two bounds, we propose an approximating expression for the error of a generic linear classifier, and we derive the corresponding bounds and approximation for the error of Fisher's classifier in particular. Our empirical results on synthetic data, including samples with up to five hundred features, show that the error computations are extremely fast and quite accurate; the approximation differs from the actual error by at most ε=0.0184340683.
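The one-dimensional reduction mentioned above can be illustrated with a short, standard computation (this is a generic sketch of the well-known projection argument, not the paper's bounds or approximation; the function names and the equal-prior setup are illustrative). For the rule "assign class 1 iff w·x + b > 0" with Gaussian classes, the projection w·x is itself normal, so the exact error follows from the standard normal CDF regardless of the original dimensionality:

```python
import math
import numpy as np

def normal_cdf(z):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def linear_classifier_error(w, b, mu1, S1, mu2, S2, p1=0.5):
    """Exact error of the rule 'class 1 iff w.x + b > 0' for Gaussian
    classes N(mu1, S1) and N(mu2, S2), via the 1-D projection onto w."""
    m1, s1 = w @ mu1 + b, math.sqrt(w @ S1 @ w)
    m2, s2 = w @ mu2 + b, math.sqrt(w @ S2 @ w)
    # Class 1 errs when its projection falls at or below 0;
    # class 2 errs when its projection falls above 0.
    return p1 * normal_cdf(-m1 / s1) + (1 - p1) * (1.0 - normal_cdf(-m2 / s2))

def fisher_direction(mu1, S1, mu2, S2):
    """Fisher's discriminant direction (S1 + S2)^{-1} (mu1 - mu2)."""
    return np.linalg.solve(S1 + S2, mu1 - mu2)
```

For two spherical classes with means (0, 0) and (2, 0) and identity covariances, Fisher's direction with the threshold at the midpoint yields the textbook error Φ(−1) ≈ 0.1587, which the functions above reproduce exactly.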