
Predicting cortical oscillations with bidirectional LSTM network: a simulation study

  • Original Paper
  • Published in: Nonlinear Dynamics

Abstract

It has been stated that up-down-state (UDS) cortical oscillations between excitatory and inhibitory neurons play a fundamental role in brain network construction. Predicting the time-series behavior of neurons in periodic and chaotic regimes can aid the study of disease, higher-order human activities, and memory consolidation. Time-series prediction is typically performed with machine learning methods. In this paper, a deep bidirectional long short-term memory (DBLSTM) network is employed to predict the time evolution of regular, large-scale UDS oscillations produced by a previously developed neocortical network model. On noisy time-series prediction tasks, we compared the DBLSTM's performance with that of deep recurrent networks built from three other cell types: standard LSTM, projected LSTM, and gated recurrent unit (GRU) cells. We also applied the classic seasonal autoregressive integrated moving average (SARIMA) method as an additional baseline. The results are justified through qualitative resemblance between the bifurcation diagrams of the actual and predicted outputs and through quantitative error analyses of network performance. Extensive simulations showed that the DBLSTM network provides accurate short- and long-term predictions in both periodic and chaotic behavioral regimes and remains robust in the presence of signal corruption.
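The paper's exact architecture and hyperparameters are given in the full text; as a minimal sketch of the general approach, one-step-ahead prediction with a stacked bidirectional LSTM in PyTorch might look as follows (the layer count, hidden size, and window length here are illustrative assumptions, not the paper's settings):

```python
import torch
import torch.nn as nn

class DBLSTMPredictor(nn.Module):
    """Stacked bidirectional LSTM with a linear read-out for
    one-step-ahead time-series prediction (hyperparameters assumed)."""
    def __init__(self, n_features=1, hidden=64, layers=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=layers,
                            bidirectional=True, batch_first=True)
        # 2 * hidden: forward and backward hidden states are concatenated
        self.head = nn.Linear(2 * hidden, n_features)

    def forward(self, x):              # x: (batch, time, features)
        out, _ = self.lstm(x)          # out: (batch, time, 2 * hidden)
        return self.head(out[:, -1])   # predict the sample after the window

model = DBLSTMPredictor()
x = torch.randn(8, 100, 1)   # e.g. eight 100-step windows of an oscillation trace
y = model(x)                 # (8, 1): one-step-ahead predictions
```

Training would proceed with a standard regression loss (e.g. MSE) on sliding windows cut from the simulated UDS time series.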



Data availability

The data generated and analyzed during the current study are available from the corresponding author upon reasonable request. The software code that supports the findings of this study is also available from the corresponding author upon reasonable request.

Abbreviations

UDS: Up-down state
LSTM: Long short-term memory
BLSTM: Bidirectional LSTM
DBLSTM: Deep BLSTM
GRU: Gated recurrent unit
SARIMA: Seasonal autoregressive integrated moving average
EEG: Electroencephalogram
LFP: Local field potential
DSFA: Dendritic spike frequency adaptation
RNN: Recurrent neural network
RC: Reservoir computing
TSP: Time-series prediction
RMSE: Root mean square error
NRMSE: Normalized RMSE
EX: Excitatory neurons
IN: Inhibitory neurons
PC: Pyramidal cells
S-IN: Slow inhibitory neurons
F-IN: Fast inhibitory neurons
RKF: Runge–Kutta–Fehlberg
PDF: Probability density function
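For reference, the RMSE and NRMSE metrics listed above can be sketched as follows (normalizing by the range of the true series is one common convention and an assumption here; the paper may normalize differently):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error between two 1-D series."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def nrmse(y_true, y_pred):
    """RMSE normalized by the range of the true series (assumed convention;
    normalization by the mean or standard deviation is also common)."""
    y_true = np.asarray(y_true, float)
    return rmse(y_true, y_pred) / float(np.ptp(y_true))

# A perfect prediction scores zero; a constant guess of the mean scores 0.5 here.
print(nrmse([0.0, 2.0], [0.0, 2.0]))  # 0.0
print(nrmse([0.0, 2.0], [1.0, 1.0]))  # 0.5
```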


Acknowledgements

We would like to thank Dr. Fatemeh Hadaeghi (Institute of Computational Neuroscience, University Medical Center Hamburg-Eppendorf (UKE), Hamburg, Germany) and Dr. Sajad Jafari (Department of Biomedical Engineering, Amirkabir University of Technology, Tehran, Iran), who provided insight and expertise that greatly assisted this research.

Funding

In this study, no funding was received from any university or institution.

Author information


Contributions

AF: Conceptualization, Validation, Visualization, Software, Methodology & testing, Writing—original draft. MG: Conceptualization, Validation, Visualization, Writing—review & editing.

Corresponding author

Correspondence to Ali Foroutannia.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have influenced the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Foroutannia, A., Ghasemi, M. Predicting cortical oscillations with bidirectional LSTM network: a simulation study. Nonlinear Dyn 111, 8713–8736 (2023). https://doi.org/10.1007/s11071-023-08251-x

