New Bounds and Approximations for the Error of Linear Classifiers

  • Luis Rueda
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3287)

Abstract

In this paper, we derive lower and upper bounds for the probability of error of a linear classifier, where the random vectors representing the underlying classes follow a multivariate normal distribution. The error expression is derived in one-dimensional space, independently of the dimensionality of the original problem. Based on the two bounds, we propose an approximating expression for the error of a generic linear classifier; in particular, we derive the corresponding bounds and the approximating expression for the error of Fisher’s classifier. Our empirical results on synthetic data, including samples of up to five hundred dimensions, show that the error computations are extremely fast and quite accurate: the approximation differs from the actual error by at most ε = 0.0184340683.
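
The bounds and approximation target the exact error of a linear rule, which, once the data are projected onto the discriminant direction, reduces to a one-dimensional Gaussian computation. The sketch below is a minimal illustration of that reduction, not the paper's own bounds or approximating expression: the function names are hypothetical, Fisher's direction is formed with the pooled scatter S1 + S2, and the threshold is placed at the midpoint of the projected means, assumptions made here for concreteness.

```python
import numpy as np
from scipy.stats import norm

def linear_classifier_error(w, w0, mu1, S1, mu2, S2, p1=0.5):
    """Exact error of the rule 'decide class 1 if w^T x + w0 > 0' when
    class i is N(mu_i, S_i). The projection w^T x is one-dimensional
    normal, so the error reduces to two standard normal CDF evaluations."""
    m1, s1 = w @ mu1 + w0, np.sqrt(w @ S1 @ w)   # projected mean/std, class 1
    m2, s2 = w @ mu2 + w0, np.sqrt(w @ S2 @ w)   # projected mean/std, class 2
    p2 = 1.0 - p1
    # class 1 is misclassified when the projection falls below 0,
    # class 2 when it falls above 0
    return p1 * norm.cdf(-m1 / s1) + p2 * (1.0 - norm.cdf(-m2 / s2))

def fisher_direction(mu1, S1, mu2, S2):
    """Fisher's discriminant direction using the pooled within-class scatter."""
    return np.linalg.solve(S1 + S2, mu1 - mu2)

# Toy two-dimensional example.
mu1, mu2 = np.array([1.0, 1.0]), np.array([-1.0, -1.0])
S1 = np.array([[1.0, 0.3], [0.3, 1.0]])
S2 = np.array([[1.5, -0.2], [-0.2, 0.8]])
w = fisher_direction(mu1, S1, mu2, S2)
w0 = -0.5 * (w @ (mu1 + mu2))   # threshold at the midpoint of the projected means
print(linear_classifier_error(w, w0, mu1, S1, mu2, S2))
```

Because only the projected means w·μ_i and projected variances wᵀΣ_i w enter the expression, the cost of evaluating the error is essentially independent of the original dimensionality, which is consistent with the fast computations reported in the abstract.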

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Luis Rueda
    1. School of Computer Science, University of Windsor, Windsor, Canada
