Cumulant-based parameter estimation using structured networks

2 Author(s)
Wang, L.X. (Dept. of Electr. Eng.-Syst., Univ. of Southern California, Los Angeles, CA, USA); Mendel, J.M.

A two-level, three-layer structured network is developed to estimate moving-average (MA) model parameters by matching second-order and third-order cumulants. The structured network is a multilayer feedforward network composed of linear summers whose weights have a clear physical meaning. The first level consists of random-access memory units that control the connectivities of the second-level summers. The second level consists of three layers of linear summers, where the weight of each summer represents a moving-average parameter to be estimated. The connectivities among these summers are controlled by the first-level memory units so that the outputs of the second-level structured network equal the desired second-order or third-order statistics when the summer weights equal their true moving-average parameter values. Each second-order and third-order cumulant is viewed as a pattern that the structured network must learn, and a steepest-descent algorithm is proposed for training the network. The authors also present extensions to particular kinds of estimation, along with simulation results.
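The cumulant-matching idea behind the paper can be illustrated with a minimal sketch (not the paper's structured-network implementation): for an MA(q) model x(n) = sum_i b(i) w(n-i) driven by i.i.d. non-Gaussian noise, the second- and third-order output cumulants are polynomial functions of the parameters b(i), so steepest descent on the squared mismatch between target and model cumulants can recover the parameters. The function names, the finite-difference gradient, and the assumed noise statistics sigma2 and gamma3 below are illustrative assumptions, not the authors' notation.

```python
# A minimal sketch of cumulant-matching MA parameter estimation by steepest
# descent. The structured-network layering of the paper is collapsed here into
# direct evaluation of the cumulant formulas; all names are illustrative.
import numpy as np

def ma_cumulants(b, sigma2=1.0, gamma3=1.0):
    """Second- and third-order output cumulants of an MA(q) model
    x(n) = sum_i b[i] w(n-i), with i.i.d. non-Gaussian input w(n)
    of variance sigma2 and third-order cumulant gamma3 (assumed known)."""
    q = len(b) - 1
    c2 = np.array([sigma2 * sum(b[i] * b[i + k] for i in range(q + 1 - k))
                   for k in range(q + 1)])
    c3 = np.array([[gamma3 * sum(b[i] * b[i + k] * b[i + l]
                                 for i in range(q + 1 - max(k, l)))
                    for l in range(q + 1)] for k in range(q + 1)])
    return c2, c3

def fit_ma(c2_target, c3_target, q, lr=0.01, iters=5000):
    """Steepest descent on the squared cumulant-matching error,
    using a simple central finite-difference gradient."""
    b = np.r_[1.0, 0.1 * np.ones(q)]           # b[0] fixed to 1 by convention

    def loss(b):
        c2, c3 = ma_cumulants(b)
        return np.sum((c2 - c2_target) ** 2) + np.sum((c3 - c3_target) ** 2)

    eps = 1e-6
    for _ in range(iters):
        g = np.zeros_like(b)
        for j in range(1, q + 1):               # keep b[0] fixed
            e = np.zeros_like(b)
            e[j] = eps
            g[j] = (loss(b + e) - loss(b - e)) / (2 * eps)
        b -= lr * g
    return b

# Usage: recover hypothetical MA(2) parameters from their exact cumulants.
b_true = np.array([1.0, 0.9, 0.385])
c2_t, c3_t = ma_cumulants(b_true)
print(fit_ma(c2_t, c3_t, q=2))                  # should approach b_true
```

In practice the target cumulants would be sample estimates computed from observed data rather than exact values, and the paper's network formulation supplies the gradient analytically through its layered summer structure rather than by finite differences.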

Published in:

IEEE Transactions on Neural Networks (Volume 2, Issue 1)