Abstract
In this paper, we give an experimental evaluation of our time-series decision tree induction method under various conditions. Our time-series tree stores a value (i.e., a time sequence) of a time-series attribute in each internal node and splits examples based on the dissimilarity between a pair of time sequences. For a split test, our method selects a time sequence that exists in the data by exhaustive search based on class and shape information. It has been empirically observed that the method induces accurate and comprehensible decision trees for time-series classification, a task that has been gaining increasing attention due to its importance in various real-world applications. The evaluation has revealed several important findings, including an interaction between a split test and its measure of goodness.
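To make the abstract's "standard-example split test" concrete, the following is a minimal sketch of the core idea: exhaustively try each training sequence as a candidate standard example, split the data by a distance threshold to it, and keep the split with the highest information gain. All function names here are illustrative, plain Euclidean distance stands in for the time-series dissimilarity (the authors' method can use measures such as dynamic time warping), and only class information is scored (the paper's search also uses shape information).

```python
import numpy as np

def entropy(labels):
    # Shannon entropy (bits) of a 1-D array of class labels
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def dissimilarity(a, b):
    # Placeholder: Euclidean distance between equal-length sequences;
    # the paper's method would use a time-series measure such as DTW.
    return float(np.linalg.norm(a - b))

def best_standard_example_split(X, y):
    """Exhaustive search: for every training sequence X[i] (the candidate
    'standard example') and every observed distance threshold theta, split
    the data into {dissimilarity(x, X[i]) <= theta} vs. the rest, and
    return the (i, theta, gain) triple with the largest information gain."""
    base = entropy(y)
    best = (None, None, -1.0)  # (standard-example index, threshold, gain)
    n = len(X)
    for i in range(n):
        d = np.array([dissimilarity(X[i], X[j]) for j in range(n)])
        for theta in np.unique(d):
            left = d <= theta
            if left.all() or not left.any():
                continue  # degenerate split, skip
            gain = base - (left.mean() * entropy(y[left])
                           + (~left).mean() * entropy(y[~left]))
            if gain > best[2]:
                best = (i, float(theta), gain)
    return best
```

Because every training sequence is tried as a standard example, the chosen split point is always an actual sequence from the data, which is what makes the resulting tree readable: each internal node can be shown as a concrete time sequence plus a distance threshold.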
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Yamada, Y., Suzuki, E., Yokoi, H., Takabayashi, K. (2005). Experimental Evaluation of Time-Series Decision Tree. In: Tsumoto, S., Yamaguchi, T., Numao, M., Motoda, H. (eds) Active Mining. Lecture Notes in Computer Science(), vol 3430. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11423270_11
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-26157-5
Online ISBN: 978-3-540-31933-7
eBook Packages: Computer Science (R0)