Abstract
Following the idea of neural networks, the multi-layer statistical classifier [3] was designed to capture interactions among measurement variables through nonlinear transformations of additive models. Unlike neural networks, however, this statistical method cannot readjust its initial features, which often leads to poor classification when those features are inadequate. This article presents an iterative algorithm, based on backfitting, that modifies these features dynamically. The resulting method can be viewed as a way of estimating posterior class probabilities by projection pursuit regression, and the associated model can be interpreted as a generalization of neural networks and several other statistical models.
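To make the abstract's core idea concrete, the following is a minimal sketch of projection pursuit regression fitted by backfitting: the response is modeled as a sum of ridge functions g_m(w_m'x), and each sweep re-fits one term to the partial residuals while the other terms are held fixed. This is an illustration only, not the authors' algorithm: the polynomial smoother, the random-perturbation direction update, and the clipping step for posterior probabilities are all simplifying assumptions made here.

```python
import numpy as np

def fit_ridge_function(z, r, degree=3):
    """Least-squares polynomial fit of a ridge function g(z) to residuals r."""
    coefs = np.polyfit(z, r, degree)
    return lambda t: np.polyval(coefs, t)

def backfit_ppr(X, y, n_terms=2, n_sweeps=10, n_trials=20, step=0.1, seed=0):
    """Fit sum_m g_m(w_m' x) to y by backfitting over the terms."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # random unit-norm starting directions, one per ridge term
    W = rng.normal(size=(n_terms, p))
    W /= np.linalg.norm(W, axis=1, keepdims=True)
    G = [lambda t: np.zeros_like(t)] * n_terms
    fitted = np.zeros((n_terms, n))
    for _ in range(n_sweeps):
        for m in range(n_terms):
            # partial residual: remove every fitted term except the m-th
            r = y - fitted.sum(axis=0) + fitted[m]
            # crude direction update (illustrative stand-in for Gauss-Newton):
            # keep whichever random perturbation of w_m lowers the SSE most
            candidates = [W[m]] + [W[m] + step * rng.normal(size=p)
                                   for _ in range(n_trials)]
            best_sse = np.inf
            for w in candidates:
                w = w / np.linalg.norm(w)
                g = fit_ridge_function(X @ w, r)
                sse = np.sum((r - g(X @ w)) ** 2)
                if sse < best_sse:
                    best_w, best_g, best_sse = w, g, sse
            W[m], G[m] = best_w, best_g
            fitted[m] = best_g(X @ best_w)
    return W, G

def predict_proba(X, W, G):
    """Clip the additive fit to [0, 1] as a rough posterior-probability estimate
    when y is a 0/1 class indicator (a simplification for illustration)."""
    raw = sum(g(X @ w) for w, g in zip(W, G))
    return np.clip(raw, 0.0, 1.0)
```

The key point the abstract makes is visible in the inner loop: unlike a fixed-feature additive classifier, each sweep re-estimates the projection direction w_m itself, so the features feeding the additive model are adjusted dynamically rather than frozen at initialization.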
References
Anderson, T.W.: An Introduction to Multivariate Statistical Analysis. Wiley, New York (1984)
Bose, S.: Classification using splines. Comp. Statist. Data Anal. 22, 505–525 (1996)
Bose, S.: Multilayer Statistical Classifiers. Comp. Statist. Data Anal. 42, 685–701 (2003)
Breiman, L.: Fitting additive models to regression data: diagnostics and alternating views. Comp. Statist. Data Anal. 15, 13–46 (1993)
Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Wadsworth and Brooks, Monterey (1984)
Duda, R., Hart, P., Stork, D.G.: Pattern Classification. Wiley, New York (2000)
Friedman, J., Stuetzle, W.: Projection Pursuit Regression. J. Amer. Statist. Assoc. 76, 817–823 (1981)
Ghosh, A.K., Bose, S.: Backfitting neural networks. Computational Statistics 19, 193–210 (2004)
Ghosh, A.K., Chaudhuri, P.: Data depth and distribution free discriminant analysis using separating surfaces. Bernoulli 11, 1–27 (2005)
Ghosh, A.K., Chaudhuri, P., Sengupta, D.: Multi-scale kernel discriminant analysis. In: Mukherjee, D.P., Pal, S. (eds.) Proceedings of Fifth International Conference on Advances in Pattern Recognition (ICAPR 2003), pp. 89–93. Allied Publishers, Kolkata (2005)
Hand, D.J.: Kernel Discriminant Analysis. Wiley, Chichester (1982)
Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, New York (2001)
Kooperberg, C., Bose, S., Stone, C.J.: Polychotomous regression. J. Amer. Statist. Assoc. 92, 117–127 (1997)
Ripley, B.D.: Pattern Recognition and Neural Networks. Cambridge University Press, Cambridge (1996)
Silverman, B.W.: Density Estimation for Statistics and Data Analysis. Chapman and Hall, London (1986)
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Ghosh, A.K., Bose, S. (2005). Feature Extraction for Nonlinear Classification. In: Pal, S.K., Bandyopadhyay, S., Biswas, S. (eds) Pattern Recognition and Machine Intelligence. PReMI 2005. Lecture Notes in Computer Science, vol 3776. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11590316_21
Print ISBN: 978-3-540-30506-4
Online ISBN: 978-3-540-32420-1