Extended Kalman filter learning algorithm for hyper-complex multilayer neural networks

Authors:

H. C. S. Rughooputh (Univ. of Mauritius, Reduit, Mauritius); S. D. D. V. Rughooputh

Abstract:

A new type of multilayer perceptron (MLP), developed in quaternion algebra, has been used to perform hyper-complex nonlinear mappings and time-series prediction with reduced network complexity compared with conventional MLPs. The extended Kalman filter (EKF) technique has been used as an online algorithm for training MLP neural networks; it significantly reduces convergence time and memory requirements compared with the standard backpropagation method. In this paper, a new learning algorithm, the hyper-complex EKF, is derived for the hyper-complex MLP neural network. As an application, short-term prediction of a time series generated by the hyperchaotic Saito circuit is considered.
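
The core idea of EKF training, treating the network weights as the state of an extended Kalman filter and correcting them online from each prediction error, can be sketched as follows. This is a simplified, real-valued illustration only; the paper derives the algorithm in quaternion (hyper-complex) algebra, and the network size, noise settings, and the logistic-map series standing in for the hyperchaotic Saito circuit data are all assumptions rather than the authors' implementation.

```python
# Minimal sketch of EKF-based online training of a small real-valued MLP.
# Simplified illustration only: the paper works in quaternion algebra, and
# all sizes and noise settings below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid = 1, 5
n_w = n_hid * (n_in + 1) + (n_hid + 1)   # total number of weights (EKF state dimension)

def mlp(w, x):
    """Scalar-output MLP with one tanh hidden layer; weights packed in vector w."""
    W1 = w[:n_hid * n_in].reshape(n_hid, n_in)
    b1 = w[n_hid * n_in:n_hid * (n_in + 1)]
    W2 = w[n_hid * (n_in + 1):-1]
    b2 = w[-1]
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

def jacobian(w, x, eps=1e-6):
    """Finite-difference Jacobian of the network output w.r.t. the weights."""
    J = np.zeros(n_w)
    for i in range(n_w):
        dw = np.zeros(n_w); dw[i] = eps
        J[i] = (mlp(w + dw, x) - mlp(w - dw, x)) / (2 * eps)
    return J.reshape(1, n_w)

# EKF state: the weight vector and its error covariance.
w = 0.1 * rng.standard_normal(n_w)
P = np.eye(n_w) * 100.0     # initial weight uncertainty
Q = np.eye(n_w) * 1e-6      # process noise (keeps the filter adaptive)
R = np.array([[1e-2]])      # measurement noise

def ekf_step(w, P, x, y):
    """One online EKF update: predict the output, then correct the weights."""
    H = jacobian(w, x)
    y_hat = mlp(w, x)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    w = w + (K * (y - y_hat)).ravel()
    P = P - K @ H @ P + Q
    return w, P

# One-step-ahead prediction of a toy chaotic series (logistic map),
# standing in for the hyperchaotic Saito circuit data used in the paper.
s = [0.3]
for _ in range(300):
    s.append(3.9 * s[-1] * (1.0 - s[-1]))
for t in range(len(s) - 1):
    w, P = ekf_step(w, P, np.array([s[t]]), s[t + 1])
```

In the quaternion case derived in the paper, the weights, activations, and Jacobian entries become hyper-complex quantities, but the predict-correct structure of the update above is the same.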

Published in:

1999 International Joint Conference on Neural Networks (IJCNN '99), Volume 3

Date of Conference:

1999