Beam Search Extraction and Forgetting Strategies on Shared Ensembles
Ensemble methods improve accuracy by combining the predictions of a set of different hypotheses. They have, however, an important shortcoming: storing multiple hypotheses requires large amounts of memory. In this work we devise an ensemble method that partially addresses this problem by letting the ensemble components share their common parts. To this end, we employ a multi-tree, a structure that can simultaneously hold an ensemble of decision trees while allowing the trees to share some of their conditions. To construct the multi-tree, we define an algorithm based on beam search, with several extraction criteria and several forgetting policies for the suspended nodes. Finally, we compare the behaviour of this ensemble method with some well-known methods for generating ensembles of hypotheses.
Keywords: Ensemble Methods, Decision Trees, Randomisation, Search Space, Beam Search
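As a rough illustration of the idea, the sketch below outlines how a multi-tree might be grown by beam search: at each decision node only the best split is explored and the alternatives are recorded as suspended nodes; new trees are extracted by waking the best suspended candidate, and a forgetting policy truncates the set of suspended candidates to a fixed beam width. All names here (`candidate_splits`, `score`, `Split.partition`) are hypothetical placeholders; this is a minimal sketch of the general technique under those assumptions, not the paper's actual implementation.

```python
class MultiTreeNode:
    """A node of the multi-tree. Unlike an ordinary decision-tree node,
    it can hold several explored splits (one per embedded tree) plus a
    list of suspended alternatives that may be woken later."""
    def __init__(self, data, depth=0):
        self.data = data            # training examples reaching this node
        self.depth = depth
        self.explored = []          # list of (split, [child nodes]) pairs
        self.suspended = []         # alternative splits kept for later trees

def _explore(node, split, candidate_splits, score, beam):
    """Materialise one split at `node` and recursively grow its children."""
    children = []
    for subset in split.partition(node.data):   # hypothetical Split API
        child = MultiTreeNode(subset, node.depth + 1)
        grow(child, candidate_splits, score, beam)
        children.append(child)
    node.explored.append((split, children))

def grow(node, candidate_splits, score, beam):
    """Greedily expand `node` with its best split; the remaining splits
    become suspended nodes and enter the beam as extraction candidates."""
    splits = sorted(candidate_splits(node.data),
                    key=lambda s: score(node.data, s), reverse=True)
    if not splits:
        return                      # leaf: no admissible split
    best, rest = splits[0], splits[1:]
    node.suspended.extend(rest)     # remember the alternatives
    beam.extend((score(node.data, s), node, s) for s in rest)
    _explore(node, best, candidate_splits, score, beam)

def build_multi_tree(data, candidate_splits, score, n_trees=10, beam_width=50):
    """Grow a first tree, then wake the best suspended node n_trees - 1
    times. The forgetting policy here simply truncates the beam."""
    root = MultiTreeNode(data)
    beam = []                       # (score, node, split) suspended entries
    grow(root, candidate_splits, score, beam)
    for _ in range(n_trees - 1):
        beam.sort(key=lambda e: e[0], reverse=True)
        del beam[beam_width:]       # forgetting: discard the worst candidates
        if not beam:
            break
        _, node, split = beam.pop(0)  # extraction: wake the best candidate
        node.suspended.remove(split)
        _explore(node, split, candidate_splits, score, beam)
    return root
```

In this reading, each single decision tree corresponds to choosing one explored split per node, and the embedded trees share every condition above the point where their chosen splits diverge; an ensemble prediction would then combine the trees, for instance by majority vote.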