Abstract
To operate continuously, learning systems must address several problems: incrementality, tracking of concept drift, robustness to noise, and handling of recurring contexts. A method for on-line induction of decision trees motivated by these requirements is presented. Its strategy is threefold: maintaining a delayed window of examples in every node to which forgetting mechanisms are applied; adjusting the size of that window automatically; and using constructive induction to identify recurring contexts. In its default configuration, the proposed approach has proven to be globally efficient, reactive, robust, and problem-independent, making it suitable for problems with unknown dynamics. Notably good results have been obtained in the presence of noise and concept drift.
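The full text is not included here, but the strategy named in the abstract can be illustrated. The following is a minimal sketch, not the authors' algorithm: it assumes a per-node window of recent examples (the class name NodeWindow and the drift rule based on recent error rate are hypothetical) whose size shrinks when errors rise, which is one simple way to realize forgetting and automatic window adjustment under concept drift.

from collections import deque

class NodeWindow:
    # Hypothetical per-node example window with forgetting
    # (illustrative only; not the paper's exact mechanism).

    def __init__(self, max_size=200, min_size=20, drift_threshold=0.3):
        self.min_size = min_size
        self.drift_threshold = drift_threshold
        self.examples = deque(maxlen=max_size)   # (x, y) pairs, oldest first
        self.errors = deque(maxlen=max_size)     # 0/1 prediction errors, aligned with examples

    def add(self, x, y, predicted):
        # Store the example and whether the node mispredicted it.
        self.examples.append((x, y))
        self.errors.append(int(predicted != y))
        self._adapt()

    def error_rate(self):
        return sum(self.errors) / len(self.errors) if self.errors else 0.0

    def _adapt(self):
        # Assumed rule: a rising error rate suggests concept drift, so
        # halve the window and forget the oldest examples; the deque's
        # maxlen lets it refill toward full size once errors settle.
        if len(self.examples) < self.min_size:
            return
        if self.error_rate() > self.drift_threshold:
            target = max(self.min_size, len(self.examples) // 2)
            while len(self.examples) > target:
                self.examples.popleft()
                self.errors.popleft()

In such a scheme, a node would rebuild its local split from self.examples after each adaptation; the constructive-induction step for recurring contexts is omitted from this sketch.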
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Núñez, M., Fidalgo, R., Morales, R. (2005). On-Line Learning of Decision Trees in Problems with Unknown Dynamics. In: Gelbukh, A., de Albornoz, Á., Terashima-Marín, H. (eds) MICAI 2005: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol 3789. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11579427_45
DOI: https://doi.org/10.1007/11579427_45
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-29896-0
Online ISBN: 978-3-540-31653-4
eBook Packages: Computer Science (R0)