This paper presents a new convergence analysis of the least mean fourth (LMF) adaptive algorithm in the mean-square sense. The analysis improves on previous results in that it is valid for non-Gaussian noise distributions and explicitly shows the dependence of algorithm stability on the initial conditions of the weights. Analytical expressions are derived that relate the step size, the initial weight error vector, and mean-square stability. The analysis assumes a white zero-mean Gaussian reference signal and an independent, identically distributed (i.i.d.) measurement noise with any even probability density function (pdf). It has been shown by Nascimento and Bermudez ["Probability of Divergence for the Least-Mean Fourth (LMF) Algorithm," IEEE Transactions on Signal Processing, vol. 54, no. 4, pp. 1376-1385, Apr. 2006] that the LMF algorithm is not mean-square stable for reference signals whose pdfs have infinite support. However, the probability of divergence tends to rise abruptly only when the step size moves past a given threshold. Our analysis provides a simple, yet precise, estimate of the region of quick rise in the probability of divergence. Hence, the present analysis is useful for predicting algorithm instability in most practical applications.
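To make the setting concrete, the following is a minimal sketch (not taken from the paper) of the standard LMF weight update, w(n+1) = w(n) + mu * e(n)^3 * x(n), under the signal model the abstract assumes: a white zero-mean Gaussian reference signal and small i.i.d. measurement noise with an even pdf. The variable names (mu, w_opt, M) and the particular step-size and noise values are illustrative choices, not quantities from the paper; as the abstract notes, too large a step size (relative to the initial weight error) would make divergence likely.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 8                              # filter length (illustrative)
w_opt = 0.5 * rng.standard_normal(M)  # unknown system to identify
mu = 5e-4                          # step size, kept well below the instability region
w = np.zeros(M)                    # zero initial weights: initial weight error = w_opt

for _ in range(50_000):
    x = rng.standard_normal(M)            # white zero-mean Gaussian reference vector
    noise = 0.01 * rng.standard_normal()  # i.i.d. noise with an even (Gaussian) pdf
    d = w_opt @ x + noise                 # desired signal from the unknown system
    e = d - w @ x                         # a priori estimation error
    w = w + mu * e**3 * x                 # LMF update: stochastic gradient of E[e^4]

print(np.linalg.norm(w - w_opt))   # residual weight-error norm after adaptation
```

Because the update is driven by e^3 rather than e (as in LMS), the effective adaptation gain grows with the error magnitude, which is why stability depends on the initial weight error and not only on the step size.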