Design of Multiple Classifier Systems for Time Series Data
In previous work, we showed that the use of Multiple Input Representations (MIR) for the classification of time series data provides complementary information that leads to better accuracy. In this paper, we introduce the Static Minimization-Maximization (SMM) approach to build Multiple Classifier Systems (MCSs) using MIR. SMM consists of two steps. In the minimization step, a greedy algorithm iteratively selects classifiers from the knowledge space to minimize the training error of the MCS. In the maximization step, a modified version of Behavior Knowledge Space (BKS), called Balanced Behavior Knowledge Space (BBKS), is used to maximize the expected accuracy of the whole system given that the training error is minimized. Several popular techniques, including AdaBoost, Bagging, and Random Subspace, are used as benchmarks to evaluate the proposed approach on four time series data sets. The results of our experiments show that the proposed approach is both effective and robust for the classification of time series data. In addition, this approach could be further extended to other applications in our future research.
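The greedy selection in the minimization step can be sketched as follows. This is a minimal illustration under assumed conventions (majority voting as the combination rule and hypothetical names such as `greedy_minimize`); the paper's actual system combines classifier outputs via BBKS rather than a simple vote.

```python
import numpy as np

def greedy_minimize(preds, y, max_size=None):
    """Greedily add classifiers whose inclusion most reduces the
    majority-vote training error of the ensemble.

    preds : (n_classifiers, n_samples) array of each pool member's
            predicted labels on the training set
    y     : (n_samples,) array of true training labels
    """
    n_clf = preds.shape[0]
    max_size = max_size or n_clf
    selected, remaining = [], list(range(n_clf))
    best_err = 1.0
    while remaining and len(selected) < max_size:
        # training error of the ensemble if each candidate were added
        errs = []
        for i in remaining:
            votes = preds[selected + [i]]
            maj = np.array([np.bincount(col).argmax() for col in votes.T])
            errs.append(np.mean(maj != y))
        k = int(np.argmin(errs))
        if selected and errs[k] >= best_err:
            break  # no candidate further reduces the training error
        best_err = errs[k]
        selected.append(remaining.pop(k))
    return selected, best_err
```

The loop stops as soon as no remaining classifier lowers the ensemble's training error, which keeps the selected subset small; the maximization step would then operate on this subset.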
Keywords: Control Chart, Time Series Data, Training Error, Random Subspace, Multiple Classifier System