On the statistical efficiency of the LMS algorithm with nonstationary inputs

2 Author(s)

A fundamental relationship exists between the quality of an adaptive solution and the amount of data used in obtaining it. Quality is defined here in terms of "misadjustment," the ratio of the excess mean square error (mse) in an adaptive solution to the minimum possible mse. The higher the misadjustment, the lower the quality. The quality of the exact least squares solution is compared with the quality of the solutions obtained by the orthogonalized and the conventional least mean square (LMS) algorithms with stationary and nonstationary input data. When adapting with noisy observations, a filter trained with a finite data sample using an exact least squares algorithm will have a misadjustment given by

$$M = \frac{n}{N} = \frac{\text{number of weights}}{\text{number of training samples}}.$$

If the same adaptive filter were trained with a steady flow of data using an ideal "orthogonalized LMS" algorithm, the misadjustment would be

$$M = \frac{n}{4\tau_{\text{mse}}},$$

where $\tau_{\text{mse}}$ is the time constant of the mse learning process, measured in number of training samples. Thus, for a given time constant $\tau_{\text{mse}}$ of the learning process, the ideal orthogonalized LMS algorithm will have about as low a misadjustment as can be achieved, since this algorithm performs essentially as an exact least squares algorithm with exponential data weighting. It is well known that when rapid convergence with stationary data is required, exact least squares algorithms can in certain cases outperform the conventional Widrow-Hoff LMS algorithm. It is shown here, however, that for an important class of nonstationary problems, the misadjustment of conventional LMS is the same as that of orthogonalized LMS, which in the stationary case is shown to perform essentially as an exact least squares algorithm.
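The misadjustment relation described in the abstract can be checked numerically. The sketch below is illustrative only and is not taken from the paper: it runs a conventional Widrow-Hoff LMS filter on white (already orthogonalized) stationary input, measures the steady-state excess mse relative to the observation-noise floor, and compares it with the $M = n/(4\tau_{\text{mse}}) = n\mu$ rule of thumb. All parameter values (step size, filter length, noise variance) are assumptions chosen for demonstration.

```python
# Illustrative sketch (not from the paper): empirically estimate the
# misadjustment of a conventional LMS filter on stationary white input
# and compare it with the n/(4*tau_mse) prediction.
import numpy as np

rng = np.random.default_rng(0)

n = 8              # number of adaptive weights (assumed)
mu = 0.005         # LMS step size, assumed small enough for stability
num_samples = 200_000
noise_var = 0.01   # variance of the observation noise (= minimum mse here)
w_opt = rng.standard_normal(n)   # unknown "plant" weights to be identified

w = np.zeros(n)
excess = []
for k in range(num_samples):
    x = rng.standard_normal(n)           # white input -> R = I (orthogonal modes)
    d = w_opt @ x + np.sqrt(noise_var) * rng.standard_normal()
    e = d - w @ x                        # a priori error
    w += 2 * mu * e * x                  # Widrow-Hoff LMS update
    if k > num_samples // 2:             # measure only after convergence
        v = w_opt - w
        excess.append(v @ v)             # excess mse proxy, since R = I

# For white unit-power input, each mode has tau_mse ~ 1/(4*mu), so the
# predicted misadjustment is M ~ n*mu = n/(4*tau_mse).
measured_M = np.mean(excess) / noise_var
predicted_M = n * mu
print(f"measured misadjustment   ~ {measured_M:.4f}")
print(f"predicted n/(4*tau_mse)  ~ {predicted_M:.4f}")
```

With these assumed settings the measured misadjustment should come out close to 0.04, matching $n\mu$; nonstationary tracking behavior, the paper's main subject, is not simulated here.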

Published in:

IEEE Transactions on Information Theory (Volume: 30, Issue: 2)

Date of Publication:

Mar 1984
