Deep learning for time series classification: a review

Published in: Data Mining and Knowledge Discovery (2019)

Abstract

Time Series Classification (TSC) is an important and challenging problem in data mining. With the increasing availability of time series data, hundreds of TSC algorithms have been proposed. Among these methods, only a few have considered Deep Neural Networks (DNNs) to perform this task. This is surprising, as deep learning has seen highly successful applications in recent years. DNNs have indeed revolutionized the field of computer vision, especially with the advent of novel deeper architectures such as Residual and Convolutional Neural Networks. Apart from images, sequential data such as text and audio can also be processed with DNNs to reach state-of-the-art performance for document classification and speech recognition. In this article, we study the current state-of-the-art performance of deep learning algorithms for TSC by presenting an empirical study of the most recent DNN architectures for TSC. We give an overview of the most successful deep learning applications in various time series domains under a unified taxonomy of DNNs for TSC. We also provide an open-source deep learning framework to the TSC community, in which we implemented each of the compared approaches and evaluated them on a univariate TSC benchmark (the UCR/UEA archive) and 12 multivariate time series datasets. By training 8730 deep learning models on 97 time series datasets, we present the most exhaustive study of DNNs for TSC to date.
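
To make the experimental setup more concrete, the sketch below shows a minimal fully convolutional network (FCN) classifier for univariate time series, written in the Keras/TensorFlow style of the released framework (see the Notes below). The filter counts, kernel sizes, training settings, and the random stand-in data are illustrative assumptions for this sketch, not the exact configurations benchmarked in the study.

```python
# Minimal sketch of an FCN-style time series classifier (illustrative only).
import numpy as np
from tensorflow import keras

def build_fcn(input_length, n_classes):
    """Three Conv1D blocks (Conv -> BatchNorm -> ReLU), then global average pooling and softmax."""
    inputs = keras.Input(shape=(input_length, 1))
    x = inputs
    for filters, kernel_size in [(128, 8), (256, 5), (128, 3)]:
        x = keras.layers.Conv1D(filters, kernel_size, padding="same")(x)
        x = keras.layers.BatchNormalization()(x)
        x = keras.layers.Activation("relu")(x)
    x = keras.layers.GlobalAveragePooling1D()(x)
    outputs = keras.layers.Dense(n_classes, activation="softmax")(x)
    model = keras.Model(inputs, outputs)
    model.compile(optimizer=keras.optimizers.Adam(),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Illustrative usage with random data standing in for a UCR/UEA dataset
# (100 series of length 96, 3 classes); real experiments would load the archive files.
x_train = np.random.randn(100, 96, 1).astype("float32")
y_train = keras.utils.to_categorical(np.random.randint(0, 3, size=100), num_classes=3)
model = build_fcn(input_length=96, n_classes=3)
model.fit(x_train, y_train, epochs=2, batch_size=16, verbose=0)
```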


Notes

  1. The implementations are available on https://github.com/hfawaz/dl-4-tsc.

  2. www.github.com/hfawaz/dl-4-tsc.


Acknowledgements

The authors would like to thank the creators and providers of the datasets: Hoang Anh Dau, Anthony Bagnall, Kaveh Kamgar, Chin-Chia Michael Yeh, Yan Zhu, Shaghayegh Gharghabi, Chotirat Ann Ratanamahatana, Eamonn Keogh and Mustafa Baydogan. The authors would also like to thank NVIDIA Corporation for the GPU Grant and the Mésocentre of Strasbourg for providing access to the cluster. The authors would also like to thank François Petitjean and Charlotte Pelletier for the fruitful discussions, their feedback and comments while writing this paper. This work was supported by the ANR TIMES project (Grant ANR-17-CE23-0015) of the French Agence Nationale de la Recherche.

Author information

Corresponding author

Correspondence to Hassan Ismail Fawaz.

Additional information

Responsible editor: Dr. Eamonn Keogh.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Ismail Fawaz, H., Forestier, G., Weber, J. et al. Deep learning for time series classification: a review. Data Min Knowl Disc 33, 917–963 (2019). https://doi.org/10.1007/s10618-019-00619-1

