
A Significantly Faster Elastic-Ensemble for Time-Series Classification

  • Conference paper
Intelligent Data Engineering and Automated Learning – IDEAL 2019 (IDEAL 2019)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11871)

Abstract

The Elastic Ensemble [7] has one of the longest build times of all constituents of the current state-of-the-art algorithm for time-series classification, the Hierarchical Vote Collective of Transformation-based Ensembles (HIVE-COTE) [8]. We investigate two simple and intuitive techniques to reduce the time spent training the Elastic Ensemble and, consequently, to reduce HIVE-COTE train time. Both techniques reduce the effort involved in tuning the parameters of each constituent nearest-neighbour classifier of the Elastic Ensemble. First, we shrink the parameter space of each constituent to reduce tuning effort. Second, we limit the number of training series used by each nearest-neighbour classifier to reduce the cost of evaluating each parameter option during tuning. Experiments over 10 folds of the UEA/UCR time-series classification problems show that each technique alone gives much faster build times and, crucially, that combining the two gives an even greater speedup, all without significant loss in accuracy.
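The two speedups described in the abstract can be illustrated with a short sketch: tuning the warping-window parameter of a 1-NN DTW classifier (one constituent of the Elastic Ensemble) by leave-one-out accuracy, over a deliberately reduced parameter grid and on a capped subsample of the training series. The `dtw_distance` and `tune_window` functions below are illustrative helpers written for this sketch, not the authors' implementation, and the toy data is hypothetical.

```python
import numpy as np


def dtw_distance(a, b, window):
    """DTW distance between series a and b, constrained to a
    Sakoe-Chiba band of half-width `window` (in samples)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - window), min(m, i + window) + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return float(np.sqrt(cost[n, m]))


def tune_window(X, y, windows, max_train=50, seed=0):
    """Pick the best warping window by leave-one-out 1-NN accuracy.

    `windows` is a reduced parameter grid (technique 1); at most
    `max_train` series are used while tuning (technique 2).
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))[:max_train]
    Xs, ys = X[idx], y[idx]
    best_w, best_acc = windows[0], -1.0
    for w in windows:
        correct = 0
        for i in range(len(Xs)):
            # Leave-one-out: exclude series i from its own neighbours.
            dists = [dtw_distance(Xs[i], Xs[j], w) if j != i else np.inf
                     for j in range(len(Xs))]
            if ys[int(np.argmin(dists))] == ys[i]:
                correct += 1
        acc = correct / len(Xs)
        if acc > best_acc:
            best_w, best_acc = w, acc
    return best_w, best_acc


# Toy demo: two classes of short ramps (hypothetical data).
X = np.array([[0, 1, 2, 3, 4, 5],
              [0, 1, 2, 3, 4, 6],
              [5, 4, 3, 2, 1, 0],
              [6, 4, 3, 2, 1, 0]], dtype=float)
y = np.array([0, 0, 1, 1])
best_w, loo_acc = tune_window(X, y, windows=[1, 2], max_train=4)
```

In a full elastic ensemble, the same tune-then-train pattern would repeat for each constituent distance measure; both the smaller grid and the capped subsample cut tuning cost multiplicatively, which is why the paper finds their combination gives the greatest speedup.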

Notes

  1. https://github.com/TonyBagnall/uea-tsc/commit/07408d166072e8fd3057cb1fcbfd913e603094e3.

  2. https://github.com/alan-turing-institute/sktime.

References

  1. Chen, L., Ng, R.: On the marriage of Lp-norms and edit distance. In: Proceedings 30th International Conference on Very Large Databases (VLDB) (2004)

  2. Chen, Y., et al.: The UEA-UCR time series classification archive (2015). http://www.cs.ucr.edu/~eamonn/time_series_data/

  3. Deng, H., Runger, G., Tuv, E., Vladimir, M.: A time series forest for classification and feature extraction. Inf. Sci. 239, 142–153 (2013)

  4. Hills, J., Lines, J., Baranauskas, E., Mapp, J., Bagnall, A.: Classification of time series by shapelet transformation. Data Min. Knowl. Disc. 28(4), 851–881 (2014)

  5. Jeong, Y., Jeong, M., Omitaomu, O.: Weighted dynamic time warping for time series classification. Pattern Recogn. 44, 2231–2240 (2011)

  6. Keogh, E., Pazzani, M.: Derivative dynamic time warping. In: Proceedings 1st SIAM International Conference on Data Mining (SDM) (2001)

  7. Lines, J., Bagnall, A.: Time series classification with ensembles of elastic distance measures. Data Min. Knowl. Disc. 29, 565–592 (2015)

  8. Lines, J., Taylor, S., Bagnall, A.: Time series classification with HIVE-COTE: the hierarchical vote collective of transformation-based ensembles. ACM Trans. Knowl. Discov. Data 12(5) (2018)

  9. Marteau, P.: Time warp edit distance with stiffness adjustment for time series matching. IEEE Trans. Pattern Anal. Mach. Intell. 31(2), 306–318 (2009)

  10. Schäfer, P.: The BOSS is concerned with time series classification in the presence of noise. Data Min. Knowl. Disc. 29(6), 1505–1530 (2015)

  11. Schäfer, P., Leser, U.: Fast and accurate time series classification with WEASEL. In: Proceedings of the 2017 ACM on Conference on Information and Knowledge Management, pp. 637–646. ACM (2017)

  12. Stefan, A., Athitsos, V., Das, G.: The Move-Split-Merge metric for time series. IEEE Trans. Knowl. Data Eng. 25(6), 1425–1438 (2013)

  13. Ye, L., Keogh, E.: Time series shapelets: a novel technique that allows accurate, interpretable and fast classification. Data Min. Knowl. Disc. 22(1–2), 149–182 (2011)

Author information

Corresponding authors

Correspondence to George Oastler or Jason Lines.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Oastler, G., Lines, J. (2019). A Significantly Faster Elastic-Ensemble for Time-Series Classification. In: Yin, H., Camacho, D., Tino, P., Tallón-Ballesteros, A., Menezes, R., Allmendinger, R. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2019. IDEAL 2019. Lecture Notes in Computer Science, vol 11871. Springer, Cham. https://doi.org/10.1007/978-3-030-33607-3_48

  • DOI: https://doi.org/10.1007/978-3-030-33607-3_48

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-33606-6

  • Online ISBN: 978-3-030-33607-3

  • eBook Packages: Computer Science, Computer Science (R0)
