Memory-Efficient Fully Coupled Filtering Approach for Observational Model Building

Authors: Hamse Y. Mussa (Unilever Centre for Molecular Sciences Informatics, Chemical Laboratories, University of Cambridge, Cambridge, U.K.); Robert C. Glen

Abstract: Training neural networks with the global extended Kalman filter (GEKF) technique generally exhibits excellent performance, but at the expense of a large increase in computational cost, which can become prohibitive even for networks of moderate size. This drawback was previously addressed by heuristically decoupling some of the weights of the network. Inevitably, such ad hoc decoupling degrades the quality (accuracy) of the resultant neural networks. In this paper, we present an algorithm that emulates the accuracy of GEKF but avoids constructing the state covariance matrix, the source of the computational bottleneck in GEKF. In the proposed algorithm, all the synaptic weights remain coupled, while the amount of computer memory required is comparable to (or lower than) that of the decoupling schemes. We also point out that the new method can be extended to derivative-free nonlinear Kalman filters, such as the unscented Kalman filter and ensemble Kalman filters.
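To make the bottleneck concrete, the following is a minimal sketch of the standard GEKF weight-update step for a model y = h(w, x), not the paper's proposed algorithm. All names (`gekf_step`, `jac_h`, etc.) are hypothetical. The point is that the state covariance P is an n x n matrix over all n weights, so its storage and update cost O(n^2), which is exactly what the paper's method avoids.

```python
import numpy as np

def gekf_step(w, P, x, y, h, jac_h, R, Q):
    """One global EKF update. w: (n,) weight vector, P: (n, n) state covariance."""
    H = jac_h(w, x)                  # (m, n) Jacobian of outputs w.r.t. all weights
    S = H @ P @ H.T + R              # (m, m) innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # (n, m) Kalman gain
    w = w + K @ (y - h(w, x))        # correct the weights with the innovation
    P = P - K @ H @ P + Q            # covariance update: storing P costs O(n^2)
    return w, P

# Toy usage: a two-weight linear "network" h(w, x) = w . x
h = lambda w, x: np.array([w @ x])
jac_h = lambda w, x: x.reshape(1, -1)

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
w = np.zeros(2)
P = np.eye(2)                        # full covariance over all weights
R = 1e-4 * np.eye(1)                 # measurement-noise covariance
Q = 1e-8 * np.eye(2)                 # small process noise for numerical stability
for _ in range(200):
    x = rng.standard_normal(2)
    w, P = gekf_step(w, P, x, h(w_true, x), h, jac_h, R, Q)
# w should now be close to w_true
```

Decoupled variants replace P with block-diagonal pieces to shrink this O(n^2) cost, which is the ad hoc approximation, and its accuracy loss, that the paper's fully coupled algorithm is designed to sidestep.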

Published in: IEEE Transactions on Neural Networks (Volume: 21, Issue: 4)