
Artificial Life and Robotics, Volume 17, Issue 1, pp 35–40

Non-parametric identification of continuous-time Hammerstein systems using Gaussian process model and particle swarm optimization

Original Article

Abstract

This paper deals with non-parametric identification of continuous-time Hammerstein systems using Gaussian process (GP) models. A Hammerstein system consists of a memoryless non-linear static part followed by a linear dynamic part. The identification model is derived from a GP prior model, which is described by its mean function vector and covariance matrix. This prior model is trained by a separable least-squares (LS) approach that combines the linear LS method with particle swarm optimization to minimize the negative log marginal likelihood of the identification data. The non-linear static part is then estimated by the predictive mean function of the GP, and a confidence measure for the estimate is provided by the predictive covariance function of the GP. Simulation results illustrate the effectiveness of the proposed method.
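To make the quantities mentioned above concrete, the following is a minimal, generic sketch of GP regression whose kernel hyperparameters are selected by a basic particle swarm search minimizing the negative log marginal likelihood, after which the predictive mean and variance are evaluated. It is not the authors' separable least-squares formulation for the Hammerstein structure; the squared-exponential kernel, the PSO constants, the toy data, and all function names are assumptions introduced only for illustration.

```python
# Minimal sketch (assumptions only): GP regression with hyperparameters chosen
# by a plain global-best PSO minimizing the negative log marginal likelihood.
import numpy as np

def sq_exp_kernel(xa, xb, sig_f, ell):
    """Squared-exponential covariance k(x, x') = sig_f^2 exp(-(x - x')^2 / (2 ell^2))."""
    d = xa[:, None] - xb[None, :]
    return sig_f**2 * np.exp(-0.5 * (d / ell)**2)

def neg_log_marginal_likelihood(theta, x, y):
    """-log p(y | x, theta) for theta = (sig_f, ell, sig_n)."""
    sig_f, ell, sig_n = theta
    K = sq_exp_kernel(x, x, sig_f, ell) + sig_n**2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L))) + 0.5 * len(x) * np.log(2 * np.pi)

def pso_minimize(obj, lb, ub, n_particles=20, n_iter=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard global-best PSO over a box [lb, ub] (constants are assumptions)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lb, ub, size=(n_particles, len(lb)))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([obj(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, len(lb)))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lb, ub)
        val = np.array([obj(p) for p in pos])
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest

# Toy data standing in for the identification data of the static non-linearity.
x_train = np.linspace(-3, 3, 40)
y_train = np.tanh(x_train) + 0.05 * np.random.default_rng(1).standard_normal(40)

theta = pso_minimize(lambda t: neg_log_marginal_likelihood(t, x_train, y_train),
                     lb=np.array([0.1, 0.1, 0.01]), ub=np.array([5.0, 5.0, 1.0]))
sig_f, ell, sig_n = theta

# Predictive mean estimates the non-linear static map; the predictive
# variance supplies the confidence measure.
x_test = np.linspace(-3, 3, 200)
K = sq_exp_kernel(x_train, x_train, sig_f, ell) + sig_n**2 * np.eye(len(x_train))
Ks = sq_exp_kernel(x_test, x_train, sig_f, ell)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
mean = Ks @ alpha
v = np.linalg.solve(L, Ks.T)
var = sig_f**2 - np.sum(v**2, axis=0)
print(theta, mean[:3], var[:3])
```

The swarm search is used here only because the marginal likelihood is generally non-convex in the kernel hyperparameters; in the paper, the PSO handles the non-linear parameters while the linear ones are obtained in closed form by linear LS.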

Keywords

Continuous-time system · Gaussian process model · Hammerstein system · Particle swarm optimization · System identification

References

  1. Adachi S, Murakami H (1995) Generalized predictive control system design based on non-linear identification by using Hammerstein model (in Japanese). Trans ISCIE 8(3):115–121
  2. Al-Duwaish H, Karim MN (1997) A new method for the identification of Hammerstein model. Automatica 33(10):1871–1875
  3. Hatanaka T, Uosaki K, Koga M (2002) Evolutionary computation approach to Hammerstein model identification. In: Proceedings of the fourth Asian Control Conference, Singapore, Sep 25–27, 2002, pp 1730–1735
  4. Hachino T, Takata H (2010) Identification of discrete-time Hammerstein systems using Gaussian process model. J Signal Process 14(2):129–137
  5. O’Hagan A (1978) Curve fitting and optimal design for prediction (with discussion). J Roy Stat Soc B 40:1–42
  6. Williams CKI, Rasmussen CE (1996) Gaussian processes for regression. In: Advances in Neural Information Processing Systems 8, MIT Press, Cambridge, Massachusetts, pp 514–520
  7. Rasmussen CE, Williams CKI (2006) Gaussian processes for machine learning. MIT Press, Cambridge, Massachusetts
  8. Kennedy J, Eberhart RC (1995) Particle swarm optimization. In: Proceedings of IEEE international conference on neural networks, Perth, Australia, Nov 27–Dec 1, 1995, pp 1942–1948
  9. Yoshida H, Kawata K, Fukuyama Y et al (2000) A particle swarm optimization for reactive power and voltage control considering voltage security assessment. IEEE Trans Power Syst 15(4):1232–1239
  10. Ide A, Yasuda K (2004) A basic study of the adaptive particle swarm optimization (in Japanese). IEEJ Trans EIS 124(2):550–557
  11. Tsang KM, Billings SA (1994) Identification of continuous time nonlinear systems using delayed state variable filters. Int J Control 60(2):159–180
  12. Kudo A, Kamimura H (1983) Statistical mathematics (in Japanese). Kyoritsu Shuppan, Tokyo
  13. Mises R (1964) Mathematical theory of probability and statistics. Academic Press, New York
  14. Hachino T, Nagatomo K, Takata H (2006) Identification of continuous-time Hammerstein systems using RBF networks and genetic algorithm. In: Proceedings of the 2006 RISP international workshop on nonlinear circuits and signal processing, Honolulu, Hawaii, USA, March 3–5, 2006, pp 21–24

Copyright information

© ISAROB 2012

Authors and Affiliations

  1. Department of Electrical and Electronics Engineering, Graduate School of Science and Engineering, Kagoshima University, Kagoshima, Japan
