
Regularized Neighborhood Component Analysis

  • Zhirong Yang
  • Jorma Laaksonen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4522)

Abstract

Discriminative feature extraction is one of the fundamental problems in pattern recognition and signal processing. It was recently proposed that maximizing the class prediction by neighboring samples in the transformed space is an effective objective for learning a low-dimensional linear embedding of labeled data. The associated methods, Neighborhood Component Analysis (NCA) and Relevant Component Analysis (RCA), have proven to be useful preprocessing techniques for discriminative information visualization and classification. We point out here that NCA and RCA are prone to overfitting, so regularization is required, and we demonstrate their failure on high-dimensional data with experiments in facial image processing. We further propose incorporating a Gaussian prior into the NCA objective, which yields Regularized Neighborhood Component Analysis (RNCA). The empirical results show that generalization can be significantly enhanced by the proposed regularization method.
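To make the abstract concrete, the sketch below spells out the NCA objective of Goldberger et al. [6] together with the quadratic penalty induced by an isotropic zero-mean Gaussian prior on the linear map; the exact parameterization and scaling of the prior in the paper may differ, so this is an illustration rather than the paper's definitive formulation. Given labeled samples (x_i, c_i) and a transformation matrix A, NCA defines stochastic neighbor probabilities and maximizes the expected number of correctly classified points, while the regularized variant subtracts the penalty:

  p_{ij} = \frac{\exp(-\lVert A x_i - A x_j \rVert^2)}{\sum_{k \neq i} \exp(-\lVert A x_i - A x_k \rVert^2)}, \qquad p_{ii} = 0,

  J_{\mathrm{NCA}}(A) = \sum_i \sum_{j : c_j = c_i} p_{ij},

  J_{\mathrm{RNCA}}(A) = J_{\mathrm{NCA}}(A) - \lambda \lVert A \rVert_F^2.

The Frobenius-norm term is, up to an additive constant, the negative log-density of the Gaussian prior on the entries of A, with \lambda set by the prior's precision: \lambda = 0 recovers plain NCA, while larger \lambda shrinks A and counteracts the overfitting observed on high-dimensional facial image data.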

Keywords

Linear Discriminant Analysis · Facial Image · FERET Database · Discriminative Component · Discriminative Direction

References

  1. Chen, Z., Haykin, S.: On different facets of regularization theory. Neural Computation 14, 2791–2846 (2002)
  2. Cristianini, N., Shawe-Taylor, J.: An Introduction to Support Vector Machines. Cambridge University Press, Cambridge (2000)
  3. Fisher, R.A.: The use of multiple measurements in taxonomic problems. Annals of Eugenics 7 (1936)
  4. Flynn, P.J., Bowyer, K.W., Phillips, P.J.: Assessment of time dependency in face recognition: An initial study. In: Kittler, J., Nixon, M.S. (eds.) AVBPA 2003. LNCS, vol. 2688. Springer, Heidelberg (2003)
  5. Girosi, F., Jones, M., Poggio, T.: Regularization theory and neural networks architectures. Neural Computation 7(2), 219–269 (1995)
  6. Goldberger, J., Roweis, S.T., Hinton, G.E., Salakhutdinov, R.: Neighbourhood components analysis. In: NIPS (2004)
  7. Golub, G.H., van Loan, C.F.: Matrix Computations, 2nd edn. The Johns Hopkins University Press, Baltimore (1989)
  8. Peltonen, J., Kaski, S.: Discriminative components of data. IEEE Transactions on Neural Networks 16(1), 68–83 (2005)
  9. Phillips, P.J., Moon, H., Rizvi, S.A., Rauss, P.J.: The FERET evaluation methodology for face recognition algorithms. IEEE Trans. Pattern Analysis and Machine Intelligence 22, 1090–1104 (2000)
  10. Schölkopf, B., Smola, A.: Learning with Kernels. MIT Press, Cambridge (2002)
  11. Tibshirani, R.: Regression shrinkage and selection via the lasso. J. Royal Statist. Soc. B 58(1), 267–288 (1996)
  12. Yang, Z., Laaksonen, J.: Partial relevance in interactive facial image retrieval. In: Singh, S., Singh, M., Apte, C., Perner, P. (eds.) ICAPR 2005. LNCS, vol. 3687, pp. 216–225. Springer, Heidelberg (2005)

Copyright information

© Springer Berlin Heidelberg 2007

Authors and Affiliations

  • Zhirong Yang (1)
  • Jorma Laaksonen (1)

  1. Laboratory of Computer and Information Science, Helsinki University of Technology, P.O. Box 5400, FI-02015 TKK, Espoo, Finland
