Abstract
In this paper, a new variational Gaussian regression filter (VGRF) is proposed by constructing a linear parametric Gaussian regression (LPGR) process that includes variational parameters. By modeling the measurement likelihood with the LPGR process to implement the Bayesian update, the nonlinear measurement function is not directly involved in the state estimation, and the complex Monte Carlo computation used in traditional methods is avoided. Hence, in the VGRF, the inference of the state posterior and the variational parameters can be achieved tractably and simply by the variational Bayesian inference approach. Secondly, a filtering evidence lower bound (F-ELBO) is proposed as a quantitative evaluation rule for different filters; compared with traditional methods, the higher estimation accuracy of the VGRF can be explained by the F-ELBO. Thirdly, a relationship between the F-ELBO and the monitored ELBO (M-ELBO) is found, namely that the F-ELBO is always larger than the M-ELBO. Based on this finding, the accuracy improvement of the VGRF can be theoretically explained. Finally, three numerical examples are employed to illustrate the effectiveness of the VGRF.
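As background for the F-ELBO and M-ELBO mentioned above, recall the standard evidence lower bound identity from variational Bayesian inference (this is the generic textbook relation, not the paper's specific bounds):

```latex
\log p(Z) \;=\;
\underbrace{\mathbb{E}_{q(\theta)}\!\left[\log p(Z,\theta)-\log q(\theta)\right]}_{\mathrm{ELBO}(q)}
\;+\;
\mathrm{KL}\!\left(q(\theta)\,\|\,p(\theta\mid Z)\right)
\;\ge\; \mathrm{ELBO}(q).
```

Since the evidence \(\log p(Z)\) is fixed, a larger achievable ELBO corresponds to a smaller KL divergence between the variational distribution and the true posterior, which is the general sense in which ELBO-type quantities can serve as evaluation rules for approximate filters.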
Ethics declarations
Conflicts of interest
We have no conflict of interest.
This work was supported in part by the National Natural Science Foundation of China under Grants 61873208, 61573287, 61203234, 61135001, and 61374023, in part by the Shaanxi Natural Science Foundation of China under Grant 2017JM6006, in part by the Aviation Science Foundation of China under Grant 2016ZC53018, in part by the Fundamental Research Funds for Central Universities under Grant 3102017jghk02009, and in part by the Equipment Pre-research Foundation under Grant 2017-HT-XG.
Appendices
Appendix A: Proof of Theorem 1
Given \(p({x_k},\mu ,\beta ,\lambda ,Z_1^k)\), applying the mean-field theory yields
where, for brevity, \(\mu\) is omitted in the derivation of the filter's structure and will be brought back into the final result. We have
For the subsequent derivation, we define
Obviously, \(prio({z_k})\) is a constant independent of the state. By rewriting \(\log q({x_k})q(\beta )\) with \(prio({z_k})\), we have
Then, by re-organizing \(\log q({x_k})q(\beta )\), (30)-(33) are obtained.
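The mean-field step invoked in this appendix is the standard coordinate-ascent identity; for reference (the standard result, stated here with this paper's joint density):

```latex
\log q^{*}(\theta_j) \;=\;
\mathbb{E}_{q(\theta_{-j})}\!\left[\log p({x_k},\mu,\beta,\lambda,Z_1^k)\right]
+ \text{const},
```

where \(\theta_j\) denotes one variational factor (e.g., \(x_k\) or \(\beta\)) and \(\theta_{-j}\) collects the remaining factors, over whose variational distributions the expectation is taken.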
Appendix B: Proof of Theorem 2
According to the mean-field theory, from \(p({x_k},\mu ,\beta ,\lambda , Z_1^k)\) it is easy to obtain
where
For the subsequent derivation, we define
where \(B = {\left( {[e \otimes {I_m}]{M}_{\mathrm{0}}^{ - 1}{{[e \otimes {I_m}]}^T} + {\bar{R}}_k^{ - 1}} \right) ^{ - 1}}\). Obviously, \(p({\bar{Z}_k})\) is independent of the state and can be decomposed as
Then, rewriting \({\log q(\mu ,\lambda )}\) with \({\log p({{\bar{Z}}_k})}\), we have
Then, by re-organizing \({\log q(\mu ,\lambda )}\), Theorem 2 is derived.
Appendix C: Calculation of M-ELBO
Based on (42), the M-ELBO is expressed as (82). The calculations of the expectations in (82) are then given by (83)–(92). For convenience of expression, the subscript of the expectation operator is omitted.
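As a generic illustration of how an ELBO such as (82) can be checked numerically, the sketch below estimates the ELBO by Monte Carlo for a toy one-dimensional linear-Gaussian model. The model, the function names (`log_gauss`, `elbo_mc`), and the sample sizes are our own assumptions for illustration and are not taken from the paper:

```python
import math
import random

def log_gauss(y, mean, var):
    """Log density of N(mean, var) evaluated at y."""
    return -0.5 * math.log(2 * math.pi * var) - 0.5 * (y - mean) ** 2 / var

def elbo_mc(z, q_mean, q_var, n=50_000, seed=0):
    """Monte Carlo ELBO estimate E_q[log p(x) + log p(z | x) - log q(x)]
    for the toy model x ~ N(0, 1), z | x ~ N(x, 1), with q = N(q_mean, q_var)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(q_mean, math.sqrt(q_var))
        total += (log_gauss(x, 0.0, 1.0)      # prior term
                  + log_gauss(z, x, 1.0)      # likelihood term
                  - log_gauss(x, q_mean, q_var))  # entropy term
    return total / n

z = 1.0
log_evidence = log_gauss(z, 0.0, 2.0)            # exact log p(z), since z ~ N(0, 2)
elbo_opt = elbo_mc(z, q_mean=z / 2, q_var=0.5)   # q = exact posterior N(z/2, 1/2)
elbo_sub = elbo_mc(z, q_mean=0.0, q_var=1.0)     # suboptimal q (the prior)
print(elbo_opt, elbo_sub)
```

At the optimum the bound is tight: with \(q\) equal to the exact posterior, the Monte Carlo integrand is constant and `elbo_opt` matches `log_evidence` to floating-point precision, while any other choice of \(q\) yields a strictly smaller value. This is the same mechanism by which a larger achievable ELBO signals a better posterior approximation.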
Cite this article
Wang, X., Cui, H., Li, T. et al. A novel nonlinear filter through constructing the parametric Gaussian regression process. Nonlinear Dyn 105, 579–602 (2021). https://doi.org/10.1007/s11071-021-06626-6