Abstract
In pattern recognition, feature selection is an important technique for reducing the measurement cost of features, for improving classifier performance, or both. Removing features that carry no discriminative information also improves the precision of the parameter estimates of parametric classifiers. Many feature selection algorithms choose a single feature subset that is useful for all classes in common; however, the best feature subset for separating one group of classes from another may differ from group to group. In this study, we investigate the effectiveness of choosing feature subsets that depend on groups of classes (class-dependent features), and we propose a classifier system built as a decision tree whose nodes have class-dependent feature subsets.
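The idea of a tree whose nodes each select their own class-dependent features can be sketched as follows. This is an illustrative toy, not the authors' algorithm: the class grouping (a naive halving of the class list), the per-node subset size (a single feature), and the midpoint-threshold separation criterion are all simplifying assumptions standing in for a real feature selector such as floating search.

```python
import numpy as np

def select_feature(X, y, group_a, group_b):
    """Toy stand-in for a feature selector: score each single feature by
    how well a midpoint threshold separates the two class groups."""
    in_a = np.isin(y, group_a)
    best = None
    for f in range(X.shape[1]):
        ma, mb = X[in_a, f].mean(), X[~in_a, f].mean()
        thr = (ma + mb) / 2.0
        a_low = ma < thr                      # which side group_a lies on
        side_low = X[:, f] < thr
        score = np.mean((side_low == a_low) == in_a)
        if best is None or score > best[0]:
            best = (score, f, thr, a_low)
    return best[1], best[2], best[3]

def build_tree(X, y, classes):
    """Each internal node splits its remaining classes into two groups and
    selects its own feature for that split (class-dependent features)."""
    if len(classes) == 1:
        return {"label": classes[0]}
    mid = len(classes) // 2                   # naive grouping assumption
    ga, gb = classes[:mid], classes[mid:]
    f, thr, a_low = select_feature(X, y, ga, gb)
    sa, sb = np.isin(y, ga), np.isin(y, gb)
    return {"feat": f, "thr": thr, "a_low": a_low,
            "left": build_tree(X[sa], y[sa], ga),
            "right": build_tree(X[sb], y[sb], gb)}

def predict(tree, x):
    while "label" not in tree:
        low = x[tree["feat"]] < tree["thr"]
        tree = tree["left"] if low == tree["a_low"] else tree["right"]
    return tree["label"]

# Toy data: feature 1 separates class 0 from {1, 2}, while feature 0
# separates class 1 from class 2 -- each node needs a different feature.
rng = np.random.default_rng(0)
centers = [(0.0, 0.0), (0.0, 5.0), (5.0, 5.0)]
X = np.vstack([rng.normal(c, 0.3, size=(20, 2)) for c in centers])
y = np.repeat([0, 1, 2], 20)

tree = build_tree(X, y, [0, 1, 2])
acc = np.mean([predict(tree, xi) == yi for xi, yi in zip(X, y)])
```

On this toy data the root node selects feature 1 to split class 0 from classes {1, 2}, and the child node selects feature 0 to split class 1 from class 2 — no single common feature subset would separate all three classes as cleanly, which is the motivation for class-dependent subsets.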
© 2002 Springer-Verlag Berlin Heidelberg
Aoki, K., Kudo, M. (2002). Decision Tree Using Class-Dependent Feature Subsets. In: Caelli, T., Amin, A., Duin, R.P.W., de Ridder, D., Kamel, M. (eds) Structural, Syntactic, and Statistical Pattern Recognition. SSPR/SPR 2002. Lecture Notes in Computer Science, vol 2396. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-70659-3_80
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-44011-6
Online ISBN: 978-3-540-70659-5