Abstract
This paper presents a general scheme for feature construction and its application to decision trees. In this scheme, a higher-level attribute is constructed from two lower-level ones under the guidance of the class distributions of the training examples. The scheme can be used with different selective induction algorithms, such as decision tree learning, rule learning, and instance-based learning. A simple prototype has been implemented and integrated into a decision tree learning system. The experimental results empirically show that our approach outperforms the standard decision tree learning algorithm on the three domains tested.
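The core idea can be illustrated with a minimal sketch: pair up existing attributes into a candidate higher-level attribute and keep the pair whose class distribution gives the best separation. The conjunction operator, the information-gain criterion, and all names below are illustrative assumptions for binary attributes, not the paper's exact construction method.

```python
from collections import Counter
from itertools import combinations
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def info_gain(feature, labels):
    """Reduction in class entropy after splitting on `feature`."""
    gain = entropy(labels)
    n = len(labels)
    for v in set(feature):
        subset = [y for f, y in zip(feature, labels) if f == v]
        gain -= len(subset) / n * entropy(subset)
    return gain

def best_constructed_feature(X, y):
    """X: list of binary attribute columns, y: class labels.
    Returns (i, j, gain) for the attribute pair whose conjunction
    best separates the classes -- a stand-in for data-driven
    construction of one higher-level attribute from two lower-level ones."""
    best = None
    for i, j in combinations(range(len(X)), 2):
        new_col = [a and b for a, b in zip(X[i], X[j])]
        g = info_gain(new_col, y)
        if best is None or g > best[2]:
            best = (i, j, g)
    return best
```

For example, if the target concept is `x0 AND x1`, neither attribute alone separates the classes well, but the constructed conjunction of columns 0 and 1 does, so the search returns that pair.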
© 1994 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhang, J., Lu, HH. (1994). A data-driven approach to feature construction. In: Raś, Z.W., Zemankova, M. (eds) Methodologies for Intelligent Systems. ISMIS 1994. Lecture Notes in Computer Science, vol 869. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-58495-1_45
DOI: https://doi.org/10.1007/3-540-58495-1_45
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-58495-7
Online ISBN: 978-3-540-49010-4
eBook Packages: Springer Book Archive