Overcoming the Myopia of Inductive Learning Algorithms with RELIEFF
Current inductive machine learning algorithms typically use greedy search with limited lookahead, which prevents them from detecting significant conditional dependencies between the attributes that describe training objects. Instead of myopic impurity functions and lookahead, we propose using RELIEFF, an extension of RELIEF developed by Kira and Rendell [10, 11], for heuristic guidance of inductive learning algorithms. We have reimplemented Assistant, a system for top-down induction of decision trees, using RELIEFF as the attribute estimator at each selection step. The algorithm is tested on several artificial and real-world problems, and the results are compared with those of other well-known machine learning algorithms. Excellent results on the artificial data sets and on two real-world problems show the advantage of the presented approach to inductive learning.
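To make the idea concrete, the following is a minimal sketch of the basic RELIEF estimator of Kira and Rendell [10, 11], on which RELIEFF builds: for sampled instances it rewards attributes that separate an instance from its nearest neighbor of a different class (nearest miss) and penalizes attributes that separate it from its nearest neighbor of the same class (nearest hit). This is an illustrative reconstruction for the two-class, numeric-attribute case, not the authors' implementation; RELIEFF additionally averages over k nearest hits/misses and handles multiple classes and missing values.

```python
import numpy as np

def relief(X, y, n_samples=None, rng=None):
    """Estimate attribute relevance weights; higher means more relevant.

    X : (n, a) array of numeric attribute values
    y : (n,) array of two class labels
    """
    rng = np.random.default_rng(rng)
    n, a = X.shape
    m = n_samples or n
    # Normalize per-attribute differences to [0, 1].
    span = X.max(axis=0) - X.min(axis=0)
    span[span == 0] = 1.0
    w = np.zeros(a)
    for _ in range(m):
        i = rng.integers(n)
        d = np.abs(X - X[i]).sum(axis=1)  # L1 distance to instance i
        d[i] = np.inf                     # never pick i as its own neighbor
        same = y == y[i]
        hit = np.argmin(np.where(same, d, np.inf))    # nearest same-class
        miss = np.argmin(np.where(~same, d, np.inf))  # nearest other-class
        # Attributes that differ on the miss gain weight;
        # attributes that differ on the hit lose weight.
        w += (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])) / (span * m)
    return w
```

In a decision tree learner such as the reimplemented Assistant, these weights would replace an impurity-based score: at each node the attribute with the highest RELIEFF estimate on the node's training instances is selected for the split.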
- [1] L. Breiman, J.H. Friedman, R.A. Olshen, and C.J. Stone, Classification and Regression Trees, Wadsworth International Group, 1984.
- [2] B. Cestnik, “Estimating probabilities: A crucial task in machine learning,” Proc. European Conference on Artificial Intelligence, Stockholm, Aug. 1990, pp. 147–149.
- [3] B. Cestnik and I. Bratko, “On estimating probabilities in tree pruning,” Proc. European Working Session on Learning, edited by Y. Kodratoff, Springer-Verlag: Porto, March 1991, pp. 138–150.
- [4] B. Cestnik, I. Kononenko, and I. Bratko, “ASSISTANT 86: A knowledge elicitation tool for sophisticated users,” in Progress in Machine Learning, edited by I. Bratko and N. Lavrač, Sigma Press: Wilmslow, England, 1987.
- [5] W. Chase and F. Brown, General Statistics, John Wiley & Sons, 1986.
- [6] B. Dolšak and S. Muggleton, “The application of inductive logic programming to finite element mesh design,” in Inductive Logic Programming, edited by S. Muggleton, Academic Press, 1992.
- [7] S. Džeroski, “Handling noise in inductive logic programming,” M.Sc. Thesis, University of Ljubljana, Faculty of Electrical Engineering & Computer Science, Ljubljana, Slovenia, 1991.
- [8] S.J. Hong, “Use of contextual information for feature ranking and discretization,” Technical Report, IBM RC19664, 7/94, 1994 (to appear in IEEE Trans. on Knowledge and Data Engineering).
- [9] E. Hunt, J. Martin, and P. Stone, Experiments in Induction, Academic Press: New York, 1966.
- [10] K. Kira and L. Rendell, “A practical approach to feature selection,” Proc. Intern. Conf. on Machine Learning, edited by D. Sleeman and P. Edwards, Morgan Kaufmann: Aberdeen, July 1992, pp. 249–256.
- [11] K. Kira and L. Rendell, “The feature selection problem: Traditional methods and a new algorithm,” Proc. AAAI'92, San Jose, CA, July 1992.
- [12] I. Kononenko, “Inductive and Bayesian learning in medical diagnosis,” Applied Artificial Intelligence, vol. 7, pp. 317–337, 1993.
- [13] I. Kononenko, “Estimating attributes: Analysis and extensions of RELIEF,” Proc. European Conf. on Machine Learning, edited by L. De Raedt and F. Bergadano, Springer-Verlag: Catania, April 1994, pp. 171–182.
- [14] I. Kononenko, “On biases when estimating multivalued attributes,” Proc. IJCAI-95, edited by C. Mellish, Morgan Kaufmann: Montreal, Aug. 1995, pp. 1034–1040.
- [15] I. Kononenko and I. Bratko, “Information based evaluation criterion for classifier's performance,” Machine Learning, vol. 6, pp. 67–80, 1991.
- [16] R.L. Mantaras, “ID3 revisited: A distance based criterion for attribute selection,” Proc. Int. Symp. Methodologies for Intelligent Systems, Charlotte, North Carolina, U.S.A., Oct. 1989.
- [17] R.S. Michalski and R.L. Chilausky, “Learning by being told and learning from examples: An experimental comparison of the two methods of knowledge acquisition in the context of developing an expert system for soybean disease diagnosis,” International Journal of Policy Analysis and Information Systems, vol. 4, pp. 125–161, 1980.
- [18] D. Michie, D.J. Spiegelhalter, and C.C. Taylor (eds.), Machine Learning, Neural and Statistical Classification, Ellis Horwood Limited, 1994.
- [19] D. Mladenič, “Combinatorial optimization in inductive concept learning,” Proc. 10th Intern. Conf. on Machine Learning, Morgan Kaufmann: Amherst, June 1993, pp. 205–211.
- [20] S. Muggleton (ed.), Inductive Logic Programming, Academic Press, 1992.
- [21] P.M. Murphy and D.W. Aha, UCI Repository of Machine Learning Databases [machine-readable data repository], University of California, Department of Information and Computer Science, Irvine, CA, 1991.
- [22] T. Niblett and I. Bratko, “Learning decision rules in noisy domains,” Proc. Expert Systems 86, Brighton, UK, Dec. 1986.
- [23] U. Pompe and I. Kononenko, “Linear space induction in first order logic with RELIEFF,” in Mathematical and Statistical Methods in Artificial Intelligence, edited by G. Della Riccia, R. Kruse, and R. Viertl, CISM Lecture Notes, Springer-Verlag, 1995.
- [24] U. Pompe, M. Kovačič, and I. Kononenko, “SFOIL: Stochastic approach to inductive logic programming,” Proc. Slovenian Conf. on Electrical Engineering and Computer Science, Portorož, Slovenia, Sept. 1993, pp. 189–192.
- [25] J.R. Quinlan, “Induction of decision trees,” Machine Learning, vol. 1, pp. 81–106, 1986.
- [26] J.R. Quinlan, “The minimum description length principle and categorical theories,” Proc. 11th Int. Conf. on Machine Learning, edited by W. Cohen and H. Hirsh, Morgan Kaufmann: Rutgers University, New Brunswick, July 1994, pp. 233–241.
- [27] H. Ragavan and L. Rendell, “Lookahead feature construction for learning hard concepts,” Proc. 10th Intern. Conf. on Machine Learning, Morgan Kaufmann: Amherst, June 1993, pp. 252–259.
- [28] H. Ragavan, L. Rendell, M. Shaw, and A. Tessmer, “Learning complex real-world concepts through feature construction,” Technical Report UIUC-BI-AI-93-03, The Beckman Institute, University of Illinois, 1993.
- [29] M. Robnik, “Constructive induction with decision trees,” B.Sc. Thesis (in Slovene), University of Ljubljana, Faculty of Electrical Engineering & Computer Science, Ljubljana, Slovenia, 1993.
- [30] P. Smyth and R.M. Goodman, “Rule induction using information theory,” in Knowledge Discovery in Databases, edited by G. Piatetsky-Shapiro and W. Frawley, MIT Press, 1990.
- [31] P. Smyth, R.M. Goodman, and C. Higgins, “A hybrid rule-based Bayesian classifier,” Proc. European Conf. on Artificial Intelligence, Stockholm, Aug. 1990, pp. 610–615.
Volume 7, Issue 1, pp. 39–55
- Kluwer Academic Publishers
Keywords: learning from examples, estimating attributes, impurity function, empirical evaluation