Abstract
In this paper we present the theoretical foundation for optimal classification using class-specific features and provide examples of its use. A new PDF projection theorem makes it possible to project probability density functions from a low-dimensional feature space back to the raw data space. An M-ary classifier is constructed by estimating the PDFs of class-specific features, then transforming each PDF back to the raw data space, where they can be compared fairly. Although statistical sufficiency is not a requirement, the classifier thus constructed becomes equivalent to the optimal Bayes classifier if the features meet sufficiency requirements individually for each class. This classifier is completely modular and avoids the dimensionality curse associated with large, complex problems. By recursive application of the projection theorem, it is possible to analyze complex signal processing chains. The feature and model selection process can be automated by direct comparison of log-likelihood values on the common raw data domain. Pre-tested modules are available for a wide range of features, including linear functions of independent random variables, the cepstrum, and the MEL cepstrum.
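The core operation described above, projecting a feature-space PDF back to the raw data space, can be illustrated with a minimal sketch. The projection formula is p(x) = [p(x; H0) / p(T(x); H0)] * g(T(x)), where H0 is a reference hypothesis under which the PDF of the feature T(x) is known exactly. The choice of feature here (sum of squares of i.i.d. standard normal data, which is chi-squared under H0) is an illustrative assumption, not one of the paper's pre-tested modules:

```python
import numpy as np
from scipy.stats import chi2, norm

def log_pdf_h0(x):
    # Reference hypothesis H0: i.i.d. standard normal raw data.
    return norm.logpdf(x).sum()

def project_log_pdf(x, log_g):
    """Project a feature-space log-PDF log_g back to the raw data space:
    log p(x) = log p(x; H0) - log p(T(x); H0) + log g(T(x)),
    with feature T(x) = sum of squares, which is chi-squared
    with len(x) degrees of freedom under H0."""
    t = np.sum(x ** 2)
    n = len(x)
    return log_pdf_h0(x) - chi2.logpdf(t, df=n) + log_g(t)

x = np.array([0.5, -1.2, 0.3, 2.0])
# Sanity check: if g is exactly the H0 feature PDF, the projection
# recovers the H0 raw-data PDF, as the theorem requires.
ll = project_log_pdf(x, lambda t: chi2.logpdf(t, df=len(x)))
assert np.isclose(ll, log_pdf_h0(x))
```

In an M-ary classifier, each class would supply its own feature and estimated feature PDF g; because every projected PDF lives on the same raw data domain, their log-likelihoods can be compared directly.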
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Baggenstoss, P.M. (2003). A New Optimal Classifier Architecture to Avoid the Dimensionality Curse. In: Perales, F.J., Campilho, A.J.C., de la Blanca, N.P., Sanfeliu, A. (eds) Pattern Recognition and Image Analysis. IbPRIA 2003. Lecture Notes in Computer Science, vol 2652. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-44871-6_9
DOI: https://doi.org/10.1007/978-3-540-44871-6_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-40217-6
Online ISBN: 978-3-540-44871-6