Feature Extraction for Nonlinear Classification
Following the idea of neural networks, a multi-layer statistical classifier was designed to capture interactions between measurement variables through nonlinear transformations of additive models. Unlike neural networks, however, this statistical method cannot readjust its initial features, which often leads to poor classification when those features are inadequate. This article presents an iterative algorithm, based on backfitting, that can modify these features dynamically. The resulting method can be viewed as an approach to estimating posterior class probabilities by projection pursuit regression, and the associated model can be interpreted as a generalization of the neural network and of related statistical models.
Keywords: Linear Discriminant Analysis, Class Boundary, Quadratic Discriminant Analysis, Projection Pursuit Regression, Posterior Class Probability
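To make the abstract's idea concrete, the sketch below shows how posterior class probabilities can be estimated by projection pursuit regression with backfitting: 0/1 class labels are regressed on a sum of ridge functions g_m(a_m^T x), and each projection direction a_m is itself readjusted iteratively rather than fixed in advance. This is a minimal illustration of the general technique, not the paper's exact algorithm; the polynomial smoother, the Gauss-Newton direction update, and all function names here are our own assumptions.

```python
import numpy as np

def fit_ridge_term(X, r, n_iter=20, degree=5, seed=0):
    """Fit one ridge term g(a^T x) to the residual r, alternating between
    refitting the smooth function g and readjusting the direction a."""
    rng = np.random.default_rng(seed)
    a = rng.normal(size=X.shape[1])
    a /= np.linalg.norm(a)
    for _ in range(n_iter):
        z = X @ a
        # Smoother step: fit g as a low-degree polynomial in the projection z
        # (a stand-in for the scatterplot smoother used in classical PPR).
        g = np.poly1d(np.polyfit(z, r, degree))
        # Direction step: Gauss-Newton update of a, linearizing g(X a)
        # around the current direction via g'(z).
        grad = g.deriv()(z)
        J = grad[:, None] * X            # Jacobian of g(X a) w.r.t. a
        da, *_ = np.linalg.lstsq(J, r - g(z), rcond=None)
        a = a + da
        a /= np.linalg.norm(a)
    z = X @ a
    return a, np.poly1d(np.polyfit(z, r, degree))

def ppr_classifier(X, y, n_terms=2):
    """Estimate P(class = 1 | x) by backfitting ridge terms to 0/1 labels:
    each new term is fit to the residual left by the previous ones."""
    offset = y.mean()
    r = y - offset
    terms = []
    for m in range(n_terms):
        a, g = fit_ridge_term(X, r, seed=m)
        terms.append((a, g))
        r = r - g(X @ a)                 # backfitting: pass on the residual
    def predict_proba(Xn):
        p = offset + sum(g(Xn @ a) for a, g in terms)
        return np.clip(p, 0.0, 1.0)     # crude projection onto [0, 1]
    return predict_proba
```

Because the directions a_m are re-estimated inside the loop, the "features" a_m^T x are modified dynamically, which is the contrast with a fixed-feature classifier drawn in the abstract.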