Convergence analysis of forgetting gradient algorithm by using martingale hyperconvergence theorem

Authors: Feng Ding (Department of Automation, Tsinghua University, Beijing 100084, China); Jiaben Yang; Yongmao Xu

The stochastic gradient (SG) algorithm carries a lighter computational burden than least squares algorithms, but it cannot track time-varying parameters and has a poor convergence rate. To improve the tracking properties of the SG algorithm, the forgetting gradient (FG) algorithm is presented, and its convergence is analyzed using the martingale hyperconvergence theorem. The results show that: (1) for time-invariant deterministic systems, the parameter estimates given by the FG algorithm converge consistently to their true values; (2) for stochastic time-varying systems, the parameter tracking error is bounded; that is, the tracking error is small when both the parameter change rate and the observation noise are small.
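A minimal sketch of a forgetting-gradient-style update may help illustrate the idea. The common form of such algorithms introduces a forgetting factor λ into the gain normalizer of the SG recursion, i.e., r(t) = λ·r(t−1) + ‖φ(t)‖²; setting λ = 1 recovers the plain SG algorithm. The exact recursion, variable names, and model structure below are assumptions for illustration, not the paper's precise formulation:

```python
import numpy as np

def fg_identify(phi, y, lam=0.98):
    """Forgetting gradient (FG) parameter estimation, sketched.

    phi : (T, n) array of regressor vectors phi(t)
    y   : (T,)   array of scalar outputs y(t)
    lam : forgetting factor, 0 < lam <= 1 (lam = 1 gives plain SG)
    """
    T, n = phi.shape
    theta = np.zeros(n)  # parameter estimate theta(t), started at zero
    r = 1.0              # scalar gain normalizer r(t)
    for t in range(T):
        p = phi[t]
        # Forgetting update of the normalizer: old information is discounted
        # by lam, so the effective gain 1/r(t) stays bounded away from zero
        # and the algorithm can keep tracking parameter changes.
        r = lam * r + p @ p
        # Normalized gradient correction toward the current observation.
        theta = theta + (p / r) * (y[t] - p @ theta)
    return theta
```

For a time-invariant deterministic (noise-free) system, this estimate approaches the true parameter vector as more data arrive, consistent with result (1) above; with λ < 1 the gain does not vanish, which is what allows the tracking behavior described in result (2).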

Published in:

Tsinghua Science and Technology (Volume 5, Issue 2)