
A data-driven approach to feature construction

  • Jianping Zhang
  • Hsueh-Hsiang Lu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 869)

Abstract

This paper presents a general scheme for feature construction and its application to decision trees. In this scheme, a higher-level attribute is constructed from two lower-level ones under the guidance of the distributions of examples from different classes. The scheme can be used with different selective induction algorithms, such as decision tree learning, rule learning, and instance-based learning. A simple prototype has been implemented and integrated into a decision tree learning system. Experimental results show that our approach outperforms the standard decision tree learning algorithm on the three domains tested.
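To make the scheme concrete, below is a minimal sketch (not the authors' implementation) of data-driven pairwise feature construction. It assumes boolean attributes, a fixed operator set of AND, OR, and XOR, and information gain as the class-distribution-based guidance; all function names and the operator set are illustrative assumptions.

# A minimal sketch of the idea, not the paper's actual algorithm:
# construct a higher-level attribute from two lower-level boolean ones
# by picking the pairwise logical combination whose values best separate
# the class distributions, scored here with information gain.
from math import log2
from itertools import combinations

def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * log2(c / n) for c in counts.values())

def info_gain(values, labels):
    """Information gain of partitioning `labels` by boolean `values`."""
    n = len(labels)
    gain = entropy(labels)
    for v in (True, False):
        subset = [y for x, y in zip(values, labels) if x == v]
        gain -= len(subset) / n * entropy(subset)
    return gain

# Candidate ways to combine two boolean attributes (an assumption;
# the paper's operator set may differ).
OPERATORS = {
    "AND": lambda a, b: a and b,
    "OR":  lambda a, b: a or b,
    "XOR": lambda a, b: a != b,
}

def construct_feature(X, labels):
    """Return (i, j, op_name, gain) for the best pairwise combination.

    X is a list of examples, each a list of booleans; the class
    distributions guide the search via the information-gain score of
    each candidate combined attribute.
    """
    best = None
    n_attrs = len(X[0])
    for i, j in combinations(range(n_attrs), 2):
        for name, op in OPERATORS.items():
            values = [op(row[i], row[j]) for row in X]
            gain = info_gain(values, labels)
            if best is None or gain > best[3]:
                best = (i, j, name, gain)
    return best

# Example: the class is attr0 XOR attr1, which no single attribute
# separates, so a univariate splitter cannot express it directly.
X = [[bool(a), bool(b), bool(c)]
     for a in (0, 1) for b in (0, 1) for c in (0, 1)]
y = [int(row[0] != row[1]) for row in X]
print(construct_feature(X, y))  # -> (0, 1, 'XOR', 1.0)

On this XOR target the search recovers the attr0 XOR attr1 combination as the new higher-level attribute, illustrating how class distributions can steer construction toward attributes that a standard decision tree splitter cannot express with univariate tests.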

Copyright information

© Springer-Verlag Berlin Heidelberg 1994

Authors and Affiliations

  • Jianping Zhang (1)
  • Hsueh-Hsiang Lu (1)

  1. Department of Computer Science, Utah State University, Logan, USA
