Logistic Model Trees

Received: 22 June 2004 / Revised: 21 December 2004 / Accepted: 22 December 2004
DOI: 10.1007/s10994-005-0466-3

Cite this article as: Landwehr, N., Hall, M., & Frank, E. Mach Learn (2005) 59: 161. doi:10.1007/s10994-005-0466-3

Abstract
Tree induction methods and linear models are popular techniques for supervised learning tasks, both for the prediction of nominal classes and numeric values. For predicting numeric quantities, there has been work on combining these two schemes into ‘model trees’, i.e. trees that contain linear regression functions at the leaves. In this paper, we present an algorithm that adapts this idea for classification problems, using logistic regression instead of linear regression. We use a stagewise fitting process to construct the logistic regression models that can select relevant attributes in the data in a natural way, and show how this approach can be used to build the logistic regression models at the leaves by incrementally refining those constructed at higher levels in the tree. We compare the performance of our algorithm to several other state-of-the-art learning schemes on 36 benchmark UCI datasets, and show that it produces accurate and compact classifiers.
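The stagewise fitting process referred to above is based on LogitBoost (Friedman et al., 2000) with simple linear regression functions as base learners, so each boosting iteration effectively selects a single attribute; the models at lower nodes in the tree are built by continuing the iterations started at their parent. As a rough illustration, the following Python/NumPy sketch shows the two-class case only; the function names and the fixed iteration count are our own simplifications (the paper's algorithm handles multiple classes and selects the number of iterations by cross-validation), not the authors' implementation.

```python
import numpy as np

def fit_simple_wls(x, z, w):
    """Weighted least-squares fit of working responses z on one attribute x.
    Returns (intercept, slope, weighted squared error)."""
    xm = np.average(x, weights=w)
    zm = np.average(z, weights=w)
    denom = np.sum(w * (x - xm) ** 2)
    slope = np.sum(w * (x - xm) * (z - zm)) / denom if denom > 0 else 0.0
    intercept = zm - slope * xm
    resid = z - (intercept + slope * x)
    return intercept, slope, float(np.sum(w * resid ** 2))

def stagewise_logistic(X, y, n_iter=30, F_init=None):
    """Two-class LogitBoost with simple regression functions (one attribute
    per iteration). y must contain 0/1 labels. Passing F_init mimics a child
    node incrementally refining the committee built at its parent."""
    n, d = X.shape
    F = np.zeros(n) if F_init is None else F_init.copy()
    committee = []  # list of (attribute index, intercept, slope)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-2.0 * F))       # p(x) = e^F / (e^F + e^-F)
        w = np.clip(p * (1.0 - p), 1e-10, None)  # working weights
        z = (y - p) / w                          # working responses
        # attribute selection falls out of the fit: keep the single
        # attribute whose simple regression explains z best
        fits = [fit_simple_wls(X[:, j], z, w) for j in range(d)]
        j = int(np.argmin([f[2] for f in fits]))
        a, b, _ = fits[j]
        committee.append((j, a, b))
        F = F + 0.5 * (a + b * X[:, j])          # stagewise additive update
    return F, committee
```

Calling stagewise_logistic(X_child, y_child, F_init=F_parent) on the examples reaching a child node continues refining the parent's committee, which is the incremental-refinement idea described in the abstract.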
Keywords: model trees · logistic regression · classification
This is an extended version of a paper that appeared in the Proceedings of the 14th European Conference on Machine Learning (Landwehr et al., 2003).
References
Blake, C., & Merz, C. J. (1998). UCI repository of machine learning databases. [www.ics.uci.edu/~mlearn/MLRepository.html].
Breiman, L., Friedman, J. H., Olshen, R. A., & Stone, C. J. (1984). Classification and Regression Trees. Wadsworth.
Chan, K. Y., & Loh, W. Y. (2004). LOTUS: An algorithm for building accurate and comprehensible logistic regression trees. Journal of Computational and Graphical Statistics, 13(4).
Frank, E., Wang, Y., Inglis, S., Holmes, G., & Witten, I. H. (1998). Using model trees for classification. Machine Learning, 32(1).
Freund, Y., & Schapire, R. E. (1996). Experiments with a new boosting algorithm. In Proc. 13th International Conference on Machine Learning (pp. 148–156). Morgan Kaufmann.
Friedman, J., Hastie, T., & Tibshirani, R. (2000). Additive logistic regression: A statistical view of boosting. The Annals of Statistics, 28(2).
Gama, J. (2004). Functional trees. Machine Learning, 55(3).
Hastie, T., Tibshirani, R., & Friedman, J. (2001). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer-Verlag.
Ihaka, R., & Gentleman, R. (1996). R: A language for data analysis and graphics. Journal of Computational and Graphical Statistics, 5(3).
Kohavi, R. (1996). Scaling up the accuracy of naive Bayes classifiers: A decision-tree hybrid. In Proc. 2nd International Conference on Knowledge Discovery and Data Mining (pp. 202–207). Menlo Park, CA: AAAI Press.
Landwehr, N., Hall, M., & Frank, E. (2003). Logistic model trees. In Proc. 14th European Conference on Machine Learning (pp. 241–252). Springer-Verlag.
Lim, T.-S., Loh, W. Y., & Shih, Y. S. (2000). A comparison of prediction accuracy, complexity, and training time for thirty-three old and new classification algorithms. Machine Learning, 40(3).
Lubinsky, D. (1994). Tree structured interpretable regression. In D. Fisher & H. J. Lenz (Eds.), Learning from Data (pp. 387–398). Springer-Verlag.
Malerba, D., Appice, A., Ceci, M., & Monopoli, M. (2002). Trading-off local versus global effects of regression nodes in model trees. In M.-S. Hacid, Z. W. Ras, D. A. Zighed, & Y. Kodratoff (Eds.), Proc. 13th International Symposium on Foundations of Intelligent Systems (pp. 393–402). Springer-Verlag.
Nadeau, C., & Bengio, Y. (2003). Inference for the generalization error. Machine Learning, 52(3).
Perlich, C., Provost, F., & Simonoff, J. (2003). Tree induction vs. logistic regression: A learning-curve analysis. Journal of Machine Learning Research, 4.
Quinlan, J. R. (1992). Learning with continuous classes. In Proc. 5th Australian Joint Conference on Artificial Intelligence (pp. 343–348). World Scientific.
Quinlan, J. R. (1993). C4.5: Programs for Machine Learning. Morgan Kaufmann.
Wang, Y., & Witten, I. H. (1997). Inducing model trees for continuous classes. In Proc. of Poster Papers, European Conference on Machine Learning. Prague, Czech Republic.
Witten, I. H., & Frank, E. (2000). Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations. San Francisco: Morgan Kaufmann.
Copyright information
© Springer Science + Business Media, Inc. 2005