
Feature Extraction for Nonlinear Classification

  • Anil Kumar Ghosh
  • Smarajit Bose
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3776)

Abstract

Following the idea of neural networks, the multi-layer statistical classifier [3] was designed to capture interactions between measurement variables using nonlinear transformations of additive models. However, unlike neural networks, this statistical method cannot readjust its initial features, and as a result it often classifies poorly when those features are inadequate. This article presents an iterative algorithm based on backfitting that can modify these features dynamically. The resulting method can be viewed as an approach for estimating posterior class probabilities by projection pursuit regression, and the associated model can be interpreted as a generalized version of neural networks and other statistical models.
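
The key ingredients of the abstract admit a compact illustration. Below is a minimal sketch (not the authors' implementation) of estimating posterior class probabilities for a two-class problem by projection pursuit regression with backfitting: each ridge term's direction and ridge function are re-estimated in turn against the partial residuals of a 0/1 class indicator. The Nadaraya-Watson smoother, the least-squares direction update, and all names are illustrative assumptions, not details from the paper.

import numpy as np

def kernel_smooth(z, r, bandwidth=0.3):
    # Nadaraya-Watson smoother: fitted values of r at each point of z.
    d = (z[:, None] - z[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)
    return (w @ r) / w.sum(axis=1)

def ppr_backfit(X, y, n_terms=3, n_sweeps=10):
    # Fit y ~ sum_m g_m(w_m' x) by cycling (backfitting) over the terms;
    # y is a 0/1 class indicator, so the fit estimates P(class 1 | x).
    n, p = X.shape
    rng = np.random.default_rng(0)
    W = rng.standard_normal((n_terms, p))            # projection directions
    W /= np.linalg.norm(W, axis=1, keepdims=True)
    fits = np.zeros((n_terms, n))                    # g_m(w_m' x) at the data
    for _ in range(n_sweeps):
        for m in range(n_terms):
            resid = y - fits.sum(axis=0) + fits[m]   # partial residuals
            # Crude direction update: least-squares fit to the residuals
            # (a stand-in for the paper's dynamic feature-modification step).
            w = np.linalg.lstsq(X, resid, rcond=None)[0]
            if np.linalg.norm(w) > 0:
                W[m] = w / np.linalg.norm(w)
            z = X @ W[m]
            z = (z - z.mean()) / (z.std() + 1e-12)   # standardize projection
            fits[m] = kernel_smooth(z, resid)        # refit ridge function
    return W, fits

The row sums of fits, clipped to [0, 1], play the role of estimated posterior probabilities; classifying to the class with the larger estimate gives the plug-in rule.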

Keywords

Linear Discriminant Analysis · Class Boundary · Quadratic Discriminant Analysis · Projection Pursuit Regression · Posterior Class Probability

References

  1. Anderson, T.W.: An Introduction to Multivariate Statistical Analysis. Wiley, New York (1984)
  2. Bose, S.: Classification using splines. Comp. Statist. Data Anal. 22, 505–525 (1996)
  3. Bose, S.: Multilayer statistical classifiers. Comp. Statist. Data Anal. 42, 685–701 (2003)
  4. Breiman, L.: Fitting additive models to regression data: diagnostics and alternating views. Comp. Statist. Data Anal. 15, 13–46 (1993)
  5. Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Wadsworth and Brooks Press, Monterey (1984)
  6. Duda, R., Hart, P., Stork, D.G.: Pattern Classification. Wiley, New York (2000)
  7. Friedman, J., Stuetzle, W.: Projection pursuit regression. J. Amer. Statist. Assoc. 76, 817–823 (1981)
  8. Ghosh, A.K., Bose, S.: Backfitting neural networks. Computational Statistics 19, 193–210 (2004)
  9. Ghosh, A.K., Chaudhuri, P.: Data depth and distribution free discriminant analysis using separating surfaces. Bernoulli 11, 1–27 (2005)
  10. Ghosh, A.K., Chaudhuri, P., Sengupta, D.: Multi-scale kernel discriminant analysis. In: Mukherjee, D.P., Pal, S. (eds.) Proceedings of the Fifth International Conference on Advances in Pattern Recognition (ICAPR 2003), pp. 89–93. Allied Publishers, Kolkata (2005)
  11. Hand, D.J.: Kernel Discriminant Analysis. Wiley, Chichester (1982)
  12. Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, New York (2001)
  13. Kooperberg, C., Bose, S., Stone, C.J.: Polychotomous regression. J. Amer. Statist. Assoc. 92, 117–127 (1997)
  14. Ripley, B.D.: Pattern Recognition and Neural Networks. Cambridge University Press, Cambridge (1996)
  15. Silverman, B.W.: Density Estimation for Statistics and Data Analysis. Chapman and Hall, London (1986)

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Anil Kumar Ghosh¹
  • Smarajit Bose¹
  1. Theoretical Statistics and Mathematics Unit, Indian Statistical Institute, Calcutta, India
