Abstract
Although feature selection is a central problem in inductive learning, as the growing body of research in this area suggests, most work has been carried out under the supervised learning paradigm, paying little attention to unsupervised learning tasks and, particularly, clustering tasks. In this paper, we analyze the particular benefits that feature selection may provide in hierarchical clustering. We propose a view of feature selection as a tree pruning process similar to those used in decision tree learning. Under this framework, we perform several experiments using different pruning strategies and considering a multiple prediction task. Results suggest that hierarchical clusterings can be greatly simplified without diminishing accuracy.
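The abstract's view of feature selection as retrospective pruning of a cluster hierarchy can be illustrated with a minimal sketch. The node structure, scoring function, and threshold below are hypothetical (the paper's experiments use COBWEB-style scores such as category utility): a subtree is collapsed when descending into its children does not improve the node's prediction quality.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch: each node of a hierarchical clustering carries a
# quality score (e.g. how well predictions made at that node perform).
@dataclass
class ClusterNode:
    score: float                        # prediction quality at this node
    children: List["ClusterNode"] = field(default_factory=list)

def prune(node: ClusterNode, eps: float = 0.0) -> ClusterNode:
    """Retrospective pruning: collapse a subtree when none of its children
    improves the score by more than eps over predicting at the node itself."""
    node.children = [prune(c, eps) for c in node.children]
    if node.children and max(c.score for c in node.children) <= node.score + eps:
        node.children = []              # cut the subtree; predict here instead
    return node

def size(node: ClusterNode) -> int:
    """Number of nodes in the (possibly pruned) hierarchy."""
    return 1 + sum(size(c) for c in node.children)
```

For example, a five-node hierarchy whose deepest split adds nothing over its parent collapses to three nodes after pruning, while the informative split at the root is retained.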
© 1999 Springer-Verlag Berlin Heidelberg
Talavera, L. (1999). Feature Selection as Retrospective Pruning in Hierarchical Clustering. In: Hand, D.J., Kok, J.N., Berthold, M.R. (eds) Advances in Intelligent Data Analysis. IDA 1999. Lecture Notes in Computer Science, vol 1642. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48412-4_7
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-66332-4
Online ISBN: 978-3-540-48412-7