Simulation and Testing for Vehicle Technology, pp 245–265
Optimal Steady-State Base-Calibration of Model Based ECU-Functions
Abstract
1. First, the relevant system variables for the calibration, such as the air mass flow and the gas temperature after mixing with EGR, are predicted as a function of the input combination, the current statistical models of the relevant submodels, and the physical structure of the system.
2. Based on the predicted system variables, an extended Kalman filter is employed to estimate the variance of the measurement points used for calibrating the submodels.
3. The information content of the predicted measurement points for the calibration of the submodels is calculated and summed. It is defined as the reduction in uncertainty over the unmeasured region of each Gaussian process model when the predicted measurement point is added to the current data set.
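The variance-estimation step above uses a standard extended Kalman filter cycle. As an illustration only (the chapter's specific state and measurement models are not reproduced here, so all function and parameter names below are assumed), a generic EKF predict/update step can be sketched as:

```python
import numpy as np

def ekf_update(x, P, z, f, h, F, H, Q, R):
    """One predict/update cycle of an extended Kalman filter.
    x, P: state mean and covariance; z: measurement;
    f, h: nonlinear transition and measurement functions;
    F, H: their Jacobians evaluated at the current estimate;
    Q, R: process and measurement noise covariances."""
    # Predict: propagate the mean through f and the covariance
    # through the linearized dynamics.
    x_pred = f(x)
    Fx = F(x)
    P_pred = Fx @ P @ Fx.T + Q
    # Update: correct the prediction with the measurement z.
    Hx = H(x_pred)
    S = Hx @ P_pred @ Hx.T + R              # innovation covariance
    K = P_pred @ Hx.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ Hx) @ P_pred
    return x_new, P_new
```

The updated covariance `P_new` is what provides the variance estimate for each predicted measurement point.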
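The selection criterion in step 3 can be sketched with a plain Gaussian process: the posterior variance of a GP depends only on the input locations, not on the measured targets, so the uncertainty reduction over the unmeasured region from adding a predicted point can be evaluated before that point is actually measured. A minimal sketch with a squared-exponential kernel follows; all names, hyperparameters, and the reference-grid construction are illustrative assumptions, not the chapter's implementation:

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0, signal_var=1.0):
    """Squared-exponential covariance between row-wise input sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return signal_var * np.exp(-0.5 * d2 / length_scale**2)

def posterior_variance(X_train, X_query, noise_var=1e-2, **kern):
    """GP posterior variance at X_query given training inputs X_train.
    Note: no targets are needed -- the variance is target-independent."""
    K = rbf_kernel(X_train, X_train, **kern) + noise_var * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_query, **kern)
    Kss = rbf_kernel(X_query, X_query, **kern)
    v = np.linalg.solve(K, Ks)
    return np.diag(Kss - Ks.T @ v)

def next_point(X_train, X_cand, X_ref, noise_var=1e-2, **kern):
    """Pick the candidate whose addition most reduces the summed posterior
    variance over a reference grid X_ref covering the unmeasured region."""
    base = posterior_variance(X_train, X_ref, noise_var, **kern).sum()
    gains = []
    for x in X_cand:
        X_aug = np.vstack([X_train, x[None, :]])
        gains.append(base - posterior_variance(X_aug, X_ref, noise_var, **kern).sum())
    gains = np.asarray(gains)
    return int(np.argmax(gains)), gains
```

In a sequential design loop, the selected candidate would be measured, appended to the training set, and the criterion re-evaluated; summing the per-submodel gains (as in step 3) extends this to several GP models sharing the same input combination.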
Keywords
Steady-state base-calibration of ECU-functions · Sequential experimental design · Gaussian process regression · Mutual information · Extended Kalman filter