On Gaussian Process NARX Models and Their Higher-Order Frequency Response Functions
One of the most versatile and powerful approaches to the identification of nonlinear dynamical systems is the NARMAX (Nonlinear AutoRegressive Moving Average with eXogenous inputs) method. The model represents the current output of a system by a nonlinear regression on past inputs and outputs and, in the most general case, can also incorporate a nonlinear noise model. Although the NARMAX model is most often given a polynomial form, this is not a restriction of the method, and other formulations have been proposed based, for example, on nonparametric machine learning paradigms. All of these forms of the NARMAX model allow the computation of Higher-order Frequency Response Functions (HFRFs), which encode the model in the frequency domain and allow a direct interpretation of how frequencies interact in the nonlinear system under study. Recently, a NARX (i.e. without a noise model) formulation based on Gaussian Process (GP) regression has been developed. One advantage of the GP NARX form is that confidence intervals are a natural part of the prediction process. The objective of the current paper is to discuss the GP formulation and show how to compute the HFRFs corresponding to GP NARX. Examples will be given based on simulated data.
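To make the GP NARX idea concrete, the following is a minimal sketch rather than the formulation developed in the paper: it fits a Gaussian process to lagged inputs and outputs of a simulated nonlinear system and uses the GP's predictive standard deviation to form confidence intervals. The toy difference equation, the lag orders (ny = 2, nu = 1), the scikit-learn API, and the kernel choice are all illustrative assumptions.

```python
# Minimal GP NARX sketch (illustrative assumptions throughout): regress the
# current output on lagged outputs and inputs, and use the GP predictive
# standard deviation as a natural confidence interval on predictions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Simulated data: a simple (assumed) nonlinear difference equation.
N = 500
u = rng.normal(size=N)
y = np.zeros(N)
for t in range(2, N):
    y[t] = 0.5 * y[t - 1] - 0.2 * y[t - 2] + 0.8 * u[t - 1] - 0.05 * y[t - 1] ** 3

# Build the NARX regressor matrix from lagged outputs and inputs.
ny, nu = 2, 1  # assumed lag orders
X = np.column_stack(
    [y[ny - k - 1 : N - k - 1] for k in range(ny)]
    + [u[ny - k - 1 : N - k - 1] for k in range(nu)]
)
target = y[ny:]

# Fit the GP; the RBF-plus-noise kernel is one reasonable choice, not the
# paper's prescribed covariance function.
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2),
    normalize_y=True,
)
gp.fit(X[:400], target[:400])

# One-step-ahead predictions on held-out data, with 95% confidence intervals.
mean, std = gp.predict(X[400:], return_std=True)
lower, upper = mean - 1.96 * std, mean + 1.96 * std
print("95% CI coverage:", np.mean((target[400:] >= lower) & (target[400:] <= upper)))
```

The sketch stops at one-step-ahead prediction; extracting the HFRFs from the trained GP, as the paper sets out to do, requires further analytical work on the fitted covariance function.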
Keywords: Nonlinear system identification · NARMAX models · Higher-order Frequency Response Functions (HFRFs) · Gaussian processes
The authors would like to thank Dr James Hensman of the University of Sheffield Centre for Translational Neuroscience for a number of interesting and useful discussions.