Using stochastic martingale theory, the convergence properties of stochastic gradient (SG) identification algorithms are studied under weak conditions. The analysis shows that the parameter estimates given by the SG algorithms converge consistently to the true parameters, provided that the information vector is persistently exciting (i.e., the data product moment matrix has a bounded condition number) and that the process noises are zero-mean and uncorrelated. These results remove the strict assumptions made in existing references that the noise variances and higher-order moments exist, that the processes are stationary and ergodic, and that the strong persistent excitation condition holds; the convergence conditions of stochastic gradient algorithms are thus greatly relaxed. Simulation results with both bounded and unbounded noise variances confirm the proposed convergence conclusions.
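The SG recursion discussed in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the authors' exact algorithm: the model structure (a simple linear regression), the regressor distribution, and the noise level are all assumptions chosen so that the persistent-excitation and zero-mean uncorrelated-noise conditions hold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed linear regression model: y(t) = phi(t)^T * theta + v(t),
# with theta_true chosen arbitrarily for illustration.
theta_true = np.array([0.8, -0.5, 1.2])
n = len(theta_true)

# Stochastic gradient (SG) identification recursion:
#   r(t)         = r(t-1) + ||phi(t)||^2
#   theta_hat(t) = theta_hat(t-1) + phi(t)/r(t) * (y(t) - phi(t)^T theta_hat(t-1))
theta_hat = np.zeros(n)
r = 1.0  # r(0) = 1 avoids division by zero in the first step

for t in range(20000):
    phi = rng.standard_normal(n)       # i.i.d. regressors: persistently exciting
    v = 0.5 * rng.standard_normal()    # zero-mean, uncorrelated noise
    y = phi @ theta_true + v
    r += phi @ phi
    theta_hat += phi / r * (y - phi @ theta_hat)

print(theta_hat)  # estimates approach theta_true as t grows
```

Because the step size 1/r(t) decays roughly like 1/t, SG converges more slowly than recursive least squares, but under the weak conditions above the estimates still approach the true parameters.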
Keywords: recursive identification, parameter estimation, least squares, stochastic gradient, multivariable systems, convergence properties, martingale convergence theorem