Design of Multiple Classifier Systems for Time Series Data

  • Lei Chen
  • Mohamed S. Kamel
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3541)

Abstract

In previous work [4], we showed that using Multiple Input Representation (MIR) for the classification of time series data provides complementary information that leads to better accuracy. In this paper, we introduce the Static Minimization-Maximization (SMM) approach to building Multiple Classifier Systems (MCSs) with MIR. SMM consists of two steps. In the minimization step, a greedy algorithm iteratively selects classifiers from the knowledge space to minimize the training error of the MCS. In the maximization step, a modified version of the Behavior Knowledge Space (BKS), the Balanced Behavior Knowledge Space (BBKS), is used to maximize the expected accuracy of the whole system given that the training error is minimized. Several popular techniques, including AdaBoost, Bagging and Random Subspace, are used as benchmarks to evaluate the proposed approach on four time series data sets. The experimental results show that the proposed approach is both effective and robust for the classification of time series data. In addition, the approach could be extended to other applications in our future research.
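
This page does not reproduce the paper's algorithmic details, but the two steps named in the abstract follow well-known patterns. Below is a minimal Python sketch under stated assumptions: integer class labels, majority-vote fusion during the greedy selection, and the standard (unbalanced) BKS table rather than the BBKS variant introduced in the paper; the function names are hypothetical.

    import numpy as np

    def majority_vote(preds):
        # preds: (n_members, n_samples) array of integer class labels;
        # returns the per-sample majority label across members
        return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)

    def greedy_minimization(candidate_preds, y_train, max_size=10):
        # Minimization step (sketch): greedily add the candidate classifier
        # whose inclusion most reduces the ensemble's training error.
        # candidate_preds: (n_candidates, n_samples) int array of predictions.
        selected, best_err = [], 1.0
        for _ in range(max_size):
            best_i = None
            for i in range(len(candidate_preds)):
                if i in selected:
                    continue
                fused = majority_vote(candidate_preds[selected + [i]])
                err = np.mean(fused != y_train)
                if err < best_err:
                    best_err, best_i = err, i
            if best_i is None:  # no remaining candidate improves training error
                break
            selected.append(best_i)
        return selected, best_err

    def build_bks_table(member_preds, y_train):
        # Standard BKS combination: each joint prediction vector (a cell in
        # the knowledge space) maps to the majority class of the training
        # samples that fall into that cell. BBKS balances these cell
        # statistics; that refinement is specific to the paper.
        cells = {}
        for s in range(member_preds.shape[1]):
            cells.setdefault(tuple(member_preds[:, s]), []).append(y_train[s])
        return {cell: np.bincount(labels).argmax() for cell, labels in cells.items()}

At test time, each selected member predicts a label, the joint label vector indexes the BKS table, and cells unseen during training need a fallback such as plain majority vote. How BBKS balances the cell statistics to maximize expected accuracy is described in the paper itself and is not reproduced here.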

References

  1. Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and regression trees. Chapman and Hall, New York (1993); previously published by Wadsworth and Brooks/Cole (1984)
  2. Breiman, L.: Bagging predictors. Machine Learning 26, 123–140 (1996)
  3. Breiman, L.: Random forests. Machine Learning 45(1), 5–32 (2001)
  4. Chen, L., Kamel, M., Jiang, J.: A modular system for the classification of time series data. In: Roli, F., Kittler, J., Windeatt, T. (eds.) MCS 2004. LNCS, vol. 3077, pp. 134–143. Springer, Heidelberg (2004)
  5. Duda, R.O., Hart, P.E., Stork, D.G.: Pattern classification, 2nd edn. Wiley-Interscience, Hoboken (2000)
  6. Diez, J.J.R., González, C.J.A.: Applying boosting to similarity literals for time series classification. In: Kittler, J., Roli, F. (eds.) MCS 2000. LNCS, vol. 1857, pp. 210–219. Springer, Heidelberg (2000)
  7. Dietterich, T.G.: An experimental comparison of three methods for constructing ensembles of decision trees: bagging, boosting, and randomization. Machine Learning 40(2), 139–157 (2000)
  8. Ghosh, J., Beck, S., Chu, C.C.: Evidence combination techniques for robust classification of short-duration oceanic signals. In: SPIE Conf. on Adaptive and Learning Systems, vol. 1706, pp. 266–276 (1999)
  9. González, C.J.A., Diez, J.R.: Time series classification by boosting interval based literals. Inteligencia Artificial, Revista Iberoamericana de Inteligencia Artificial 11, 2–11 (2000)
  10. Ghosh, J., Deuser, L., Beck, S.: A neural network based hybrid system for detection, characterization and classification of short-duration oceanic signals. IEEE Journal of Oceanic Engineering 17(4), 351–363 (1992)
  11. Giacinto, G., Roli, F.: Dynamic classifier selection based on multiple classifier behaviour. Pattern Recognition 34(9), 1879–1881 (2001)
  12. Hsu, W.H., Ray, S.R.: Construction of recurrent mixture models for time series classification. In: Proceedings of the International Joint Conference on Neural Networks, vol. 3, pp. 1574–1579 (1999)
  13. Huang, Y.S., Suen, C.Y.: A method of combining multiple experts for the recognition of unconstrained handwritten numerals. IEEE Transactions on Pattern Analysis and Machine Intelligence 17(1), 90–94 (1995)
  14. Hastie, T., Tibshirani, R., Friedman, J.: The elements of statistical learning: data mining, inference, and prediction. Springer, Heidelberg (2001)
  15. Ho, T.K.: The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence 20(8), 832–844 (1998)
  16. Saito, N.: Local feature extraction and its applications using a library of bases. PhD thesis, Department of Mathematics, Yale University (1994)
  17. Sancho, Q., Moro, I., Alonso, C., Rodríguez, J.J.: Applying simple combining techniques with artificial neural networks to some standard time series classification problems. In: Corchado, J.M., Alonso, L., Fyfe, C. (eds.) Artificial Neural Networks in Pattern Recognition, pp. 43–50 (2001)

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Lei Chen¹
  • Mohamed S. Kamel¹

  1. Pattern Analysis and Machine Intelligence Lab, Electrical and Computer Engineering, University of Waterloo, Canada
