The replication problem: A constructive induction approach

  • Der-Shung Yang
  • Gunnar Blix
  • Larry A. Rendell
Part 1: Constructive Induction And Multi-Strategy Approaches
Part of the Lecture Notes in Computer Science book series (LNCS, volume 482)

Abstract

Pagallo's FRINGE and Symmetric FRINGE can improve learning by constructing new features from the decision-tree output of an induction algorithm. The new features help alleviate the replication problem. This paper examines the influence of the replication problem on learning and studies a refined version of Symmetric FRINGE called DCFringe. Like Symmetric FRINGE, DCFringe attacks both DNF and CNF problems using a dual heuristic. Unlike Symmetric FRINGE, however, DCFringe distinguishes between conjunctive and disjunctive replication, outperforming FRINGE for CNF-type concepts while equaling its performance for DNF-type concepts. We study the scope of the replication problem by relating it to other known characteristics of difficult concepts, such as concept dispersion, relative concept size, feature interaction, and embedded parity. We discuss the generality of our solution in terms of its extensibility to other representations. We also suggest ways to overcome some limitations of our method, such as its tendency to overfit the data and its susceptibility to noise.
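
The abstract describes fringe-based feature construction only in outline. The following is a minimal, hypothetical Python sketch of how new Boolean features might be built from the fringe of a learned decision tree; the Node class, the fringe_features function, and the branch-direction rule used here to separate conjunctive from disjunctive replication are assumptions made for illustration, not the authors' implementation.

    from dataclasses import dataclass
    from typing import Callable, Dict, List, Optional

    @dataclass
    class Node:
        # Internal node: 'feature' is the attribute tested; 'left' is the
        # subtree for a False outcome, 'right' for True. Leaf: 'label' is set.
        feature: Optional[str] = None
        left: Optional["Node"] = None
        right: Optional["Node"] = None
        label: Optional[bool] = None

    def fringe_features(root: Node) -> List[Callable[[Dict[str, bool]], bool]]:
        # Walk the tree; at each leaf, pair the last two tests on its path.
        # A positive leaf reached through two True branches suggests a
        # conjunctive feature; a negative leaf reached through two False
        # branches suggests a disjunctive feature (a dual heuristic in the
        # spirit of the abstract). The exact pairing rule is an assumption.
        new_features: List[Callable[[Dict[str, bool]], bool]] = []

        def walk(node: Optional[Node], path) -> None:
            if node is None:
                return
            if node.label is not None:                      # reached a leaf
                if len(path) >= 2:
                    (gp, gp_branch), (p, p_branch) = path[-2], path[-1]
                    a, b = gp.feature, p.feature
                    if node.label and gp_branch and p_branch:
                        new_features.append(lambda ex, a=a, b=b: ex[a] and ex[b])
                    elif not node.label and not gp_branch and not p_branch:
                        new_features.append(lambda ex, a=a, b=b: ex[a] or ex[b])
                return
            walk(node.left, path + [(node, False)])
            walk(node.right, path + [(node, True)])

        walk(root, [])
        return new_features

In a FRINGE-style loop, the constructed features would be added to the attribute set and the tree re-learned, iterating until no new features are produced; the stopping criterion here is likewise only a plausible reading of the abstract.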

Keywords

Empirical learning · Constructive induction

References

  1. [Breiman et al., 1984]
    Leo Breiman, Jerome H. Friedman, Richard A. Olshen, and Charles J. Stone. Classification and Regression Trees. Wadsworth, Belmont, CA, 1984.
  2. [Holte and Porter, 1989]
    R. C. Holte and Bruce W. Porter. Concept learning and the problem of small disjuncts. In Proceedings of the Eleventh International Joint Conference on Artificial Intelligence, pages 813–824, 1989.
  3. [Kearns et al., 1987]
    Michael Kearns, Ming Li, Leonard Pitt, and Leslie G. Valiant. Recent results on boolean concept learning. In Proceedings of the Fourth International Workshop on Machine Learning, Irvine, CA, 1987.
  4. [Matheus, 1989]
    Christopher J. Matheus. Feature Construction: An Analytical Framework and an Application to Decision Trees. PhD thesis, University of Illinois at Urbana-Champaign, December 1989.
  5. [Pagallo and Haussler, 1990]
    Giulia Pagallo and David Haussler. Boolean feature discovery in empirical learning. Machine Learning, 5:71–99, 1990.
  6. [Pagallo, 1989]
    Giulia Pagallo. Learning DNF by decision trees. In Proceedings of the Eleventh International Joint Conference on Artificial Intelligence, 1989.
  7. [Pagallo, 1990]
    Giulia Pagallo. Adaptive Decision Tree Algorithms for Learning from Examples. PhD thesis, University of California at Santa Cruz, June 1990.
  8. [Quinlan, 1986]
    J. Ross Quinlan. Induction of decision trees. Machine Learning, 1(1):81–106, 1986.
  9. [Ragavan and Rendell, 1990]
    Harish Ragavan and Larry A. Rendell. Context-based acquisition of difficult concepts. Unpublished manuscript, 1990.
  10. [Rendell and Cho, 1990]
    Larry A. Rendell and Howard Cho. Empirical learning as a function of concept character. Machine Learning, 5(3):267–298, 1990.
  11. [Rendell and Seshu, 1990]
    Larry A. Rendell and Raj Seshu. Learning hard concepts. Computational Intelligence, 1990. (To appear).
  12. [Rendell, 1983]
    Larry A. Rendell. A new basis for state-space learning systems and a successful implementation. Artificial Intelligence, 20(4):369–392, 1983.
  13. [Seshu, 1989]
    Raj Seshu. Solving the parity problem. In Proceedings of the Fourth European Working Session on Learning, pages 263–271, Montpellier, France, December 1989.
  14. [Yang and Blix, 1990]
    Der-Shung Yang and Gunnar Blix. FRINGE, DCFringe and μ concepts. Presented at the Workshop on Computational Learning Theory and Natural Learning Systems, 1990.
  15. [Yang et al., 1991]
    Der-Shung Yang, Larry A. Rendell, and Gunnar Blix. A scheme for feature construction and a comparison of empirical methods. Submitted to the Thirteenth International Joint Conference on Artificial Intelligence, 1991.
  16. [Yang, 1991]
    Der-Shung Yang. Feature discovery in decision tree representation. Master's thesis, University of Illinois at Urbana-Champaign, May 1991.

Copyright information

© Springer-Verlag Berlin Heidelberg 1991

Authors and Affiliations

  • Der-Shung Yang (1)
  • Gunnar Blix (1)
  • Larry A. Rendell (1)

  1. Beckman Institute and Computer Science Department, University of Illinois at Urbana-Champaign, Urbana