Kernel Embedded Nonlinear Observational Mappings in the Variational Mapping Particle Filter

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11539)

Abstract

Recently, several studies have proposed methods that combine variational probabilistic inference with Monte Carlo sampling. One promising approach is based on local optimal transport: a gradient steepest-descent method, formulated from local optimal transport principles, deterministically transforms point samples from an intermediate density to the posterior density. The local mappings that transform the intermediate densities are embedded in a reproducing kernel Hilbert space (RKHS). This variational mapping method requires evaluating the gradient of the log-posterior density, and therefore the adjoint of the observational operator. In this work, we evaluate nonlinear observational mappings in the variational mapping method using two approximations that avoid the adjoint: an ensemble-based approximation, in which the gradient is approximated by the sample cross-covariances between the state and observational spaces (the so-called ensemble space), and an RKHS approximation, in which the observational mapping itself is embedded in an RKHS and the gradient is derived there. The approximations are evaluated for highly nonlinear observational operators in a low-dimensional chaotic dynamical system. The RKHS approximation proves highly successful and superior to the ensemble approximation for non-Gaussian posterior densities.
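The deterministic particle transformation described above takes the form of a Stein variational gradient descent update: each particle is moved along a kernel-smoothed gradient of the log-posterior plus a repulsion term that keeps the samples spread out. The following is a minimal NumPy sketch of that update, not the paper's implementation; the Gaussian kernel, bandwidth `h`, step size `eps`, and the toy standard-normal target are all illustrative choices.

```python
import numpy as np

def rbf_kernel(x, y, h):
    """Gaussian RBF kernel matrix and its gradient w.r.t. the first argument."""
    diff = x[:, None, :] - y[None, :, :]                  # (n, m, d)
    k = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))  # (n, m)
    grad_x = -diff / h ** 2 * k[:, :, None]               # (n, m, d)
    return k, grad_x

def svgd_step(particles, grad_logp, h=1.0, eps=0.1):
    """One Stein variational gradient descent update.

    particles : (n, d) array of samples
    grad_logp : callable (n, d) -> (n, d), gradient of the log-posterior
                evaluated at each particle
    """
    n = particles.shape[0]
    k, grad_k = rbf_kernel(particles, particles, h)
    # Attraction: kernel-weighted log-posterior gradients.
    # Repulsion: sum_j grad_{x_j} k(x_j, x_i), which prevents collapse.
    phi = (k @ grad_logp(particles) + grad_k.sum(axis=0)) / n
    return particles + eps * phi

# Toy example: standard-normal posterior, so grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, size=(100, 1))   # particles start far from the target
for _ in range(500):
    x = svgd_step(x, lambda p: -p)
print(float(x.mean()), float(x.std()))   # particles drift toward the target
```

In the data-assimilation setting of the paper, `grad_logp` is where the adjoint of the observational operator would normally enter; the two approximations studied here replace it with either ensemble cross-covariances or an RKHS embedding of the observational mapping.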

Keywords

Variational inference · Stein discrepancy · Data assimilation

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Meteorology, University of Reading, Reading, UK
  2. Department of Physics, Universidad Nacional del Nordeste, Corrientes, Argentina
  3. Department of Atmospheric Science, Colorado State University, Fort Collins, USA
  4. Jet Propulsion Laboratory, California Institute of Technology, Pasadena, USA