Square root covariance ladder algorithms

Authors: B. Porat (Stanford University, Stanford, CA, USA); Benjamin Friedlander; M. Morf

Square root normalized ladder algorithms provide an efficient recursive solution to the problem of multichannel autoregressive model fitting. A simplified derivation of the general update formulas for such ladder forms is presented, and is used to develop the growing memory and sliding memory covariance ladder algorithms. New ladder form realizations for the identified models are presented, leading to convenient methods for computing the model parameters from estimated reflection coefficients. A complete solution to the problem of possible singularity in the ladder update equations is also presented.
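The abstract mentions recovering AR model parameters from estimated reflection coefficients. The paper's algorithms are multichannel and square-root normalized; purely for orientation, the sketch below shows the classical scalar step-up (Levinson) recursion relating reflection coefficients to prediction-error filter coefficients. The function name, sign convention, and scalar restriction are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: the textbook scalar step-up (Levinson) recursion
# mapping reflection coefficients to AR predictor coefficients. The paper
# treats the multichannel, square-root-normalized case; this scalar version
# is shown just to indicate the kind of relationship involved.

def reflection_to_ar(k):
    """Map reflection coefficients k[0..p-1] to coefficients a[1..p] of the
    prediction-error filter A(z) = 1 + a[1] z^-1 + ... + a[p] z^-p
    (sign convention assumed: a_m(m) = k_m)."""
    a = []  # coefficients a[1..m] at the current model order m
    for m, km in enumerate(k, start=1):
        prev = a[:]
        # step-up: a_m(i) = a_{m-1}(i) + k_m * a_{m-1}(m-i), i = 1..m-1
        a = [prev[i - 1] + km * prev[m - i - 1] for i in range(1, m)]
        a.append(km)  # a_m(m) = k_m
    return a

if __name__ == "__main__":
    # Example with two arbitrarily chosen reflection coefficients
    print(reflection_to_ar([0.5, -0.3]))  # -> [0.35, -0.3]
```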

Published in: IEEE Transactions on Automatic Control (Volume: 27, Issue: 4)