Designing a New Search Space for Multivariate Time-Series Neural Architecture Search

  • Conference paper
Advanced Analytics and Learning on Temporal Data (AALTD 2023)

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 14343))

Abstract

With the rise of edge computing and the Internet of Things (IoT), there is an increasing demand for models with low memory footprints. These models must be adaptable to embedded system applications, while being able to leverage the large quantities of data recorded in these systems to produce superior performance.

Automatic Neural Architecture Search (NAS) has been an active and successful area of research for a number of years. However, a significant proportion of this effort has been aimed at finding architectures that can effectively extract and transform the information in image data. This has led to search-space designs that are heavily influenced by the heuristics of image classifiers.

We review and incorporate the characteristics of successful time-series methods, while seeking to address traits of conventional NAS search-space design that may be detrimental to performance on time-series data.

This paper provides an in-depth look at the effects of each of our design choices with an analysis of time-series network design spaces on two benchmark tasks: Human Activity Recognition (HAR) using the UniMib-SHAR dataset and Electroencephalography (EEG) data from the BCI Competition IV 2a dataset.

Guided by these design principles and the results of our experimental procedure, we produce a search space tailored specifically to time-series tasks. This achieves excellent performance while producing architectures with significantly fewer parameters than other deep learning approaches.

We provide results on a collection of datasets from the UEA Multivariate Time Series Classification Archive and achieve performance comparable to both deep learning and state-of-the-art machine learning time-series classification methods, using only a simple random search.
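
The paper's actual search space is not reproduced in this abstract; as an illustration of the random-search procedure it describes, the following is a minimal sketch over a made-up configuration space. All dimension names and value ranges (`num_blocks`, `kernel_size`, `channels`, `dilation`) are hypothetical placeholders, not the authors' design.

```python
import random

# Hypothetical search space: dimensions and ranges are illustrative only,
# not the search space defined in the paper.
SEARCH_SPACE = {
    "num_blocks":  [1, 2, 3, 4],
    "kernel_size": [3, 5, 7, 9],   # longer kernels are common in time-series models
    "channels":    [8, 16, 32, 64],
    "dilation":    [1, 2, 4, 8],   # dilation widens the receptive field cheaply
}

def sample_architecture(space, rng=random):
    """Draw one architecture configuration uniformly at random."""
    return {name: rng.choice(values) for name, values in space.items()}

def random_search(space, evaluate, n_trials=10, seed=0):
    """Evaluate n_trials random samples; return the best (score, config) pair."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        config = sample_architecture(space, rng)
        score = evaluate(config)  # in practice: train the model, score on validation data
        if best is None or score > best[0]:
            best = (score, config)
    return best

if __name__ == "__main__":
    # Stand-in objective that rewards smaller models (fewer channels/blocks),
    # echoing the low-memory-footprint motivation; a real objective would be
    # validation accuracy, possibly penalised by parameter count.
    def toy_evaluate(cfg):
        return 1.0 / (cfg["channels"] * cfg["num_blocks"])

    score, config = random_search(SEARCH_SPACE, toy_evaluate, n_trials=20)
    print(score, config)
```

In this setting, the quality of results depends far more on how the search space is designed than on the search strategy itself, which is why a plain uniform sampler can be competitive.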



Author information

Correspondence to Christopher MacKinnon.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

MacKinnon, C., Atkinson, R. (2023). Designing a New Search Space for Multivariate Time-Series Neural Architecture Search. In: Ifrim, G., et al. Advanced Analytics and Learning on Temporal Data. AALTD 2023. Lecture Notes in Computer Science(), vol 14343. Springer, Cham. https://doi.org/10.1007/978-3-031-49896-1_13

  • DOI: https://doi.org/10.1007/978-3-031-49896-1_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-49895-4

  • Online ISBN: 978-3-031-49896-1

  • eBook Packages: Computer Science; Computer Science (R0)
