
Data Assimilation Using Heteroscedastic Bayesian Neural Network Ensembles for Reduced-Order Flame Models

  • Conference paper
  • In: Computational Science – ICCS 2021 (ICCS 2021)
  • Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12746)

Abstract

The parameters of a level-set flame model are inferred using an ensemble of heteroscedastic Bayesian neural networks (BayNNEs). The neural networks are trained on a library of 1.7 million observations of 8500 simulations of the flame edge, obtained using the model with known parameters. The ensemble produces samples from the posterior probability distribution of the parameters, conditioned on the observations, as well as estimates of the uncertainties in the parameters. The predicted parameters and uncertainties are compared to those inferred using an ensemble Kalman filter. The expected parameter values inferred with the BayNNE method, once trained, match those inferred with the Kalman filter but require less than one millionth of the time and computational cost of the Kalman filter. This method enables a physics-based model to be tuned from experimental images in real time.

M. L. Croci and U. Sengupta contributed equally.
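The abstract describes the inference machinery only at a high level. As a rough illustration, the sketch below shows how one member of a heteroscedastic Bayesian neural network ensemble could be built and trained in PyTorch, assuming the anchored-ensembling scheme of Pearce et al. [18] on which the BayNNE method builds; the names (HeteroscedasticNet, train_member) and hyperparameter values are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class HeteroscedasticNet(nn.Module):
    """One ensemble member: maps a vector of flame-edge observations to a
    predicted parameter mean and a per-parameter log-variance."""

    def __init__(self, n_in, n_out, n_hidden=128):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(n_in, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
        )
        self.mean_head = nn.Linear(n_hidden, n_out)
        self.logvar_head = nn.Linear(n_hidden, n_out)

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), self.logvar_head(h)

def gaussian_nll(mean, logvar, target):
    # Heteroscedastic Gaussian negative log-likelihood (up to a constant).
    return 0.5 * (logvar + (target - mean) ** 2 / logvar.exp()).mean()

def anchored_penalty(model, anchors, prior_var, n_data):
    # Anchored regularisation: pull each weight towards the member's own
    # randomly drawn anchor, so the trained ensemble approximates samples
    # from the posterior over network weights (Pearce et al. [18]).
    penalty = 0.0
    for p, p0 in zip(model.parameters(), anchors):
        penalty = penalty + ((p - p0) ** 2).sum() / (2.0 * prior_var * n_data)
    return penalty

def train_member(x, y, n_epochs=200, prior_var=1.0):
    # Train one member; repeat with fresh initialisations and anchors to
    # build the full ensemble.
    model = HeteroscedasticNet(x.shape[1], y.shape[1])
    anchors = [p.detach().clone() for p in model.parameters()]
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(n_epochs):
        opt.zero_grad()
        mean, logvar = model(x)
        loss = gaussian_nll(mean, logvar, y) + anchored_penalty(model, anchors, prior_var, x.shape[0])
        loss.backward()
        opt.step()
    return model

At prediction time each trained member returns a Gaussian over the flame-model parameters; the average of the member means approximates the posterior mean, and the spread of the member means together with the predicted noise variances approximates the posterior uncertainty reported in the paper.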

Notes

  1. Cantera is a suite of tools for problems involving chemical kinetics, thermodynamics, and transport processes [13].

  2. The EnKF is fully parallelised: the processors have 32 cores in total, one for each member in the ensemble. A sketch of the EnKF analysis step appears after these notes.
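For comparison with the baseline in note 2, the following is a minimal sketch of one stochastic EnKF analysis step with perturbed observations, in the spirit of Evensen [10]. In the paper's setup the expensive part is advancing each of the 32 members with the forward flame model (one member per core), while an update of this kind is a cheap linear-algebra step. The function names, array shapes and the use of NumPy here are assumptions for illustration, not the authors' code.

import numpy as np

def enkf_analysis(ensemble, obs, obs_operator, obs_cov, rng):
    """One stochastic EnKF analysis step with perturbed observations.

    ensemble:     (N, n) array, N members of the augmented state/parameter vector
    obs:          (m,) observed data vector (e.g. flame-edge measurements)
    obs_operator: function mapping one state vector to observation space, shape (m,)
    obs_cov:      (m, m) observation-error covariance
    """
    N = ensemble.shape[0]
    predicted = np.array([obs_operator(x) for x in ensemble])   # (N, m) model observations
    X = ensemble - ensemble.mean(axis=0)                        # state anomalies
    Y = predicted - predicted.mean(axis=0)                      # observation anomalies
    Pxy = X.T @ Y / (N - 1)                                     # state-observation covariance
    Pyy = Y.T @ Y / (N - 1) + obs_cov                           # innovation covariance
    K = Pxy @ np.linalg.inv(Pyy)                                # Kalman gain
    # Each member is updated with its own perturbed copy of the observations.
    perturbed = obs + rng.multivariate_normal(np.zeros(obs.size), obs_cov, size=N)
    return ensemble + (perturbed - predicted) @ K.T

# Typical use: alternate this analysis step with a forecast step in which every
# member is advanced by the level-set flame model, one member per CPU core.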

References

  1. Juniper, M.P., Sujith, R.: Sensitivity and nonlinearity of thermoacoustic oscillations. Ann. Rev. Fluid Mech. 50, 661–689 (2018)

  2. Keller, J.J.: Thermoacoustic oscillations in combustion chambers of gas turbines. AIAA 33–12 (1995)

  3. Strutt, J.W.: The Theory of Sound, vol. II. Macmillan and Co., London (1878)

  4. Smart, A.E., Jones, B., Jewel, N.T.: Measurements of unsteady parameters in a rig designed to study reheat combustion instabilities. AIAA 26-88 (1976)

  5. Pitz, R.W., Daily, J.W.: Experimental study of combustion in a turbulent free shear layer formed at a rearward facing step. AIAA 81-106 (1981)

  6. Smith, D.A., Zukoski, E.E.: Combustion instability sustained by unsteady vortex combustion. AIAA 85-1248 (1985)

  7. Poinsot, T., Trouve, A., Veynante, D., Candel, S., Esposito, E.: Vortex-driven acoustically coupled combustion instabilities. J. Fluid Mech. 177, 265–292 (1987)

  8. Crocco, L.: Research on combustion instability in liquid propellant rockets. In: Symposium (International) on Combustion, vol. 12, no. 1, pp. 85–99 (1969)

  9. Williams, F.A.: Turbulent combustion. In: The Mathematics of Combustion, pp. 97–131 (1985)

  10. Evensen, G.: Sequential data assimilation with a nonlinear quasi-geostrophic model using Monte Carlo methods to forecast error statistics. J. Geophys. Res. 99, 10143–10162 (1994)

  11. Yu, H., Juniper, M.P., Magri, L.: Combined state and parameter estimation in level-set methods. J. Comput. Phys. 399, 108950 (2019)

  12. Yu, H., Juniper, M.P., Magri, L.: A data-driven kinematic model of a ducted premixed flame. Proc. Combust. Inst., 6231–6239 (2020)

  13. Goodwin, D.G., Speth, R.L., Moffat, H.K., Weber, B.W.: Cantera: an object-oriented software toolkit for chemical kinetics, thermodynamics, and transport processes (2018). https://www.cantera.org. Version 2.4.0

  14. Gal, Y.: Uncertainty in deep learning. Ph.D. thesis (2016)

  15. Damianou, A.C., Lawrence, N.D.: Deep Gaussian processes. In: Proceedings of the 16th International Conference on Artificial Intelligence and Statistics (AISTATS) (2013)

  16. MacKay, D.J.C.: Information Theory, Inference and Learning Algorithms. Cambridge University Press, Cambridge (2003)

  17. Gal, Y., Ghahramani, Z.: Dropout as a Bayesian approximation: representing model uncertainty in deep learning. In: Proceedings of the 33rd International Conference on Machine Learning, New York, NY, USA (2016)

  18. Pearce, T., Zaki, M., Brintrup, A., Anastassacos, N., Neely, A.: Uncertainty in neural networks: Bayesian ensembling. In: International Conference on Artificial Intelligence and Statistics (AISTATS) (2020)

  19. Sengupta, U., Croci, M.L., Juniper, M.P.: Real-time parameter inference in reduced-order flame models with heteroscedastic Bayesian neural network ensembles. In: Machine Learning and the Physical Sciences Workshop at the 34th Conference on Neural Information Processing Systems (NeurIPS) (2020)

  20. Sengupta, U., Amos, M., Hosking, J.S., Rasmussen, C.E., Juniper, M.P., Young, P.J.: Ensembling geophysical models with Bayesian neural networks. In: Advances in Neural Information Processing Systems (NeurIPS) (2020)

  21. Kashinath, K., Li, L.K.B., Juniper, M.P.: Forced synchronization of periodic and aperiodic thermoacoustic oscillations: lock-in, bifurcations and open-loop control. J. Fluid Mech. 838, 690–714 (2018)

  22. Hemchandra, S.: Dynamics of turbulent premixed flames in acoustic fields. Ph.D. thesis (2009)

  23. Lakshminarayanan, B., Pritzel, A., Blundell, C.: Simple and scalable predictive uncertainty estimation using deep ensembles. In: Advances in Neural Information Processing Systems (NeurIPS), pp. 6402–6413 (2017)

  24. Luo, X., Hoteit, I.: Robust ensemble filtering and its relation to covariance inflation in the ensemble Kalman filter. Monthly Weather Rev. 139(12), 3938–3953 (2011)

Funding

This project has received funding from the UK Engineering and Physical Sciences Research Council (EPSRC) award EP/N509620/1 and from the European Union’s Horizon 2020 research and innovation program under the Marie Skłodowska-Curie grant agreement number 766264.

Author information

Corresponding author

Correspondence to Maximilian L. Croci.

A Supplementary material: Hyperparameter settings

See Table 1.

Table 1. Hyperparameter settings.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Croci, M.L., Sengupta, U., Juniper, M.P. (2021). Data Assimilation Using Heteroscedastic Bayesian Neural Network Ensembles for Reduced-Order Flame Models. In: Paszynski, M., Kranzlmüller, D., Krzhizhanovskaya, V.V., Dongarra, J.J., Sloot, P.M. (eds) Computational Science – ICCS 2021. ICCS 2021. Lecture Notes in Computer Science, vol 12746. Springer, Cham. https://doi.org/10.1007/978-3-030-77977-1_33

  • DOI: https://doi.org/10.1007/978-3-030-77977-1_33

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-77976-4

  • Online ISBN: 978-3-030-77977-1

  • eBook Packages: Computer Science, Computer Science (R0)
