Analysis of stochastic gradient-based adaptive algorithms with general cost functions is carried out. The analysis holds under mild assumptions on the inputs and the cost function. Whereas previous analyses typically consider mean and mean-square behavior, we consider almost sure behavior. The parameter estimates are shown to enter a small neighborhood about the optimum value and remain there for a finite length of time. The asymptotic distribution of the parameter estimates is shown to be Gaussian. Adaptive algorithms which fall under the framework of this paper are signed-error LMS, dual-sign LMS, quantized-state LMS, least mean fourth, dead-zone algorithms, and momentum algorithms. Some discussion is presented regarding stochastic gradient algorithms where the regressor is replaced with a general function of the regressor.
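To make the class of algorithms concrete, the following is a minimal sketch of one member of the family named above, the signed-error LMS update w <- w + mu * sign(e) * u, applied to identifying a short FIR channel. The function name `sign_error_lms`, the filter length, step size, and the synthetic data are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sign_error_lms(x, d, mu, n_taps):
    """Illustrative signed-error LMS: only the sign of the a-priori
    error drives the update (hypothetical helper, not from the paper)."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]  # regressor, most recent sample first
        e = d[n] - w @ u                   # a-priori estimation error
        w = w + mu * np.sign(e) * u        # signed-error stochastic gradient step
    return w

# Identify an assumed 3-tap channel from noisy observations.
w_true = np.array([0.5, -0.3, 0.2])
x = rng.standard_normal(5000)
d = np.convolve(x, w_true)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = sign_error_lms(x, d, mu=0.005, n_taps=3)
```

Replacing `np.sign(e)` with `e` recovers ordinary LMS; the other variants listed (dual-sign, quantized-state, dead-zone) likewise substitute different memoryless nonlinearities into the same update.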