
A Method to Estimate the True Mahalanobis Distance from Eigenvectors of Sample Covariance Matrix

  • Masakazu Iwamura
  • Shinichiro Omachi
  • Hirotomo Aso
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2396)

Abstract

In statistical pattern recognition, the parameters of distributions are usually estimated from training sample vectors. However, the estimated parameters contain estimation errors, and these errors degrade recognition performance when the sample size is insufficient. Some existing methods obtain better estimates of the eigenvalues of the true covariance matrix and thereby avoid the degradation caused by eigenvalue estimation errors. However, estimation errors in the eigenvectors of the covariance matrix have not been considered sufficiently. In this paper, we consider the estimation errors of eigenvectors and show that they can be regarded as estimation errors of eigenvalues. We then present a method to estimate the true Mahalanobis distance from the eigenvectors of the sample covariance matrix. Recognition experiments show that, by applying the proposed method, the true Mahalanobis distance can be estimated even when the sample size is small, and better recognition accuracy is achieved. The proposed method is useful for practical pattern recognition applications because it requires no hyper-parameters.
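
To make the setting concrete, the following is a minimal sketch (in Python with NumPy, not the authors' algorithm) of how a Mahalanobis distance is computed from the eigenvalues and eigenvectors of a sample covariance matrix. The dimensionality, sample size, the eigenvalue floor, and the helper mahalanobis_sq are illustrative assumptions; the sketch only shows where the small-sample eigenvalue and eigenvector estimates, whose errors the paper addresses, enter the distance.

# Minimal sketch (illustrative, not the paper's method): Mahalanobis distance
# computed from the eigendecomposition of a sample covariance matrix.
import numpy as np

rng = np.random.default_rng(0)

d, n = 10, 15                                   # dimensionality and (small) sample size
true_mean = np.zeros(d)
true_cov = np.diag(np.linspace(5.0, 0.5, d))    # assumed true covariance (diagonal for simplicity)

samples = rng.multivariate_normal(true_mean, true_cov, size=n)
sample_mean = samples.mean(axis=0)
sample_cov = np.cov(samples, rowvar=False)      # sample covariance matrix

# Eigendecomposition: columns of vecs are the sample eigenvectors.
vals, vecs = np.linalg.eigh(sample_cov)

def mahalanobis_sq(x, mean, eigvals, eigvecs, floor=1e-6):
    """Squared Mahalanobis distance via eigenvalues/eigenvectors.

    The floor is an illustrative guard against near-zero sample eigenvalues,
    which otherwise inflate the distance when n is small relative to d.
    """
    y = eigvecs.T @ (x - mean)                  # project onto the eigenvectors
    return float(np.sum(y**2 / np.maximum(eigvals, floor)))

x = rng.multivariate_normal(true_mean, true_cov)
d_true = mahalanobis_sq(x, true_mean, np.diag(true_cov), np.eye(d))
d_est = mahalanobis_sq(x, sample_mean, vals, vecs)
print(f"true distance^2: {d_true:.2f}, estimated distance^2: {d_est:.2f}")

With small n, the estimated distance typically deviates noticeably from the true one, which is the effect the proposed eigenvalue/eigenvector correction is designed to compensate.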

Keywords

Estimation error · Feature vector · Recognition rate · Mahalanobis distance · Sample covariance matrix


Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Masakazu Iwamura¹
  • Shinichiro Omachi¹
  • Hirotomo Aso¹

  1. Graduate School of Engineering, Tohoku University, Sendai-shi, Japan
