Evolving Recurrent Neural Networks for Time Series Data Prediction of Coal Plant Parameters

  • AbdElRahman ElSaid
  • Steven Benson
  • Shuchita Patwardhan
  • David Stadem
  • Travis Desell (corresponding author)
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11454)

Abstract

This paper presents the Evolutionary eXploration of Augmenting LSTM Topologies (EXALT) algorithm and its use in evolving recurrent neural networks (RNNs) for time series data prediction. It introduces a new open data set from a coal-fired power plant, consisting of 10 days of per-minute sensor recordings from 12 different burners at the plant. This large-scale, real-world data set involves complex dependencies between sensor parameters, making it challenging to predict. EXALT provides interesting new techniques for evolving neural networks, including epigenetic weight initialization, where child neural networks reuse parental weights as a starting point for backpropagation, as well as node-level mutation operations which can improve evolutionary progress. EXALT has been designed with parallel computation in mind to further improve performance. Preliminary results were gathered predicting the Main Flame Intensity data parameter, with EXALT strongly outperforming five traditional neural network architectures on the best, average, and worst cases across 10 repeated training runs per test case; it was only slightly behind the best trained Elman recurrent neural networks while being significantly more reliable (i.e., much better average and worst case results). Further, EXALT achieved these results 2 to 10 times faster than the traditional methods, in part due to its scalability, showing strong potential to beat traditional architectures given additional runtime.
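
To make the epigenetic weight initialization idea concrete, the sketch below shows one way a child genome could reuse its parent's trained weights while randomly initializing any structure introduced by mutation or crossover. This is an illustrative Python sketch under assumed representations, not the paper's implementation: the edge-keyed weight dictionary, the epigenetic_init helper, and the random initialization range are all hypothetical.

    import random

    def epigenetic_init(child_edges, parent_weights, seed=42):
        """Sketch of epigenetic weight initialization (hypothetical helper,
        not the paper's code): edges inherited from the parent keep the
        parent's trained weight, while edges new to the child get small
        random values."""
        rng = random.Random(seed)
        weights = {}
        for edge in child_edges:
            if edge in parent_weights:
                weights[edge] = parent_weights[edge]    # reuse parent's trained weight
            else:
                weights[edge] = rng.uniform(-0.5, 0.5)  # random init for new structure
        return weights

    # Example: a child that inherits two edges and adds one via mutation.
    parent = {("in", "h1"): 0.83, ("h1", "out"): -0.41}
    child_edges = [("in", "h1"), ("h1", "out"), ("in", "out")]  # new skip edge
    print(epigenetic_init(child_edges, parent))

The intuition is that reusing parental weights gives backpropagation a warm start, so child networks can reach or exceed their parents' fitness in fewer epochs than networks trained from scratch.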

Keywords

Neuro-evolution · Recurrent neural networks · Time series data prediction

Acknowledgements

This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Advanced Combustion Systems under Award Number FE0031547.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • AbdElRahman ElSaid (1)
  • Steven Benson (2)
  • Shuchita Patwardhan (2)
  • David Stadem (2)
  • Travis Desell (1, corresponding author)

  1. Rochester Institute of Technology, Rochester, USA
  2. Microbeam Technologies Inc., Grand Forks, USA
