
Pattern Classification Using Composite Features

  • Chunghoon Kim
  • Chong-Ho Choi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4132)

Abstract

In this paper, we propose a new classification method using composite features, each of which consists of a number of primitive features. The covariance of two composite features captures the statistical dependency among multiple primitive features. The proposed discriminant analysis (C-LDA), which uses the covariance of composite features, is a generalization of linear discriminant analysis (LDA). Unlike LDA, the number of features extracted by C-LDA can be larger than the number of classes. Experimental results on several data sets indicate that C-LDA yields better classification performance than competing methods.
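
Since only the abstract is available here, the sketch below illustrates the general idea in Python: classical LDA built from within- and between-class scatter matrices, plus a simple grouping of adjacent primitive features into overlapping composite features. The windowing scheme and all function names are illustrative assumptions, not the authors' exact C-LDA formulation.

    # Sketch of LDA feature extraction and composite-feature grouping.
    # Hypothetical helper names; the sliding-window grouping is an
    # assumption, not the authors' exact C-LDA construction.
    import numpy as np

    def lda_directions(X, y):
        """LDA directions from data X (n_samples, n_dims) and labels y."""
        mean_all = X.mean(axis=0)
        d = X.shape[1]
        Sw = np.zeros((d, d))  # within-class scatter
        Sb = np.zeros((d, d))  # between-class scatter
        for c in np.unique(y):
            Xc = X[y == c]
            mc = Xc.mean(axis=0)
            Sw += (Xc - mc).T @ (Xc - mc)
            diff = (mc - mean_all)[:, None]
            Sb += Xc.shape[0] * (diff @ diff.T)
        # Generalized eigenproblem Sb w = lambda Sw w; pinv guards
        # against a singular within-class scatter (small-sample case).
        evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
        order = np.argsort(evals.real)[::-1]
        return evecs[:, order].real

    def composite_features(X, window=3):
        """Group `window` adjacent primitive features into overlapping
        composites; returns shape (n_samples, n_composites, window)."""
        n, d = X.shape
        return np.stack([X[:, i:i + window] for i in range(d - window + 1)],
                        axis=1)

    # Usage with synthetic data (3 classes, 8 primitive features):
    rng = np.random.default_rng(0)
    y = np.repeat(np.arange(3), 20)
    X = rng.normal(size=(60, 8)) + y[:, None]
    W = lda_directions(X, y)   # classical LDA: at most 2 useful directions
    Z = composite_features(X)  # composites of 3 adjacent primitives each

In classical LDA the between-class scatter has rank at most (number of classes − 1), which caps the number of useful directions. By accumulating scatter over many composite feature groups, a C-LDA-style method can extract more directions than classes, consistent with the property stated in the abstract.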

Keywords

Face Recognition, Linear Discriminant Analysis, Machine Intelligence, Composite Feature, Primitive Feature



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Chunghoon Kim (1)
  • Chong-Ho Choi (1)
  1. School of Electrical Engineering and Computer Science, Seoul National University, Seoul, Korea
