
Structural properties of gradient recurrent high-order neural networks


Authors: E. B. Kosmatopoulos; M. A. Christodoulou (Dept. of Electron. & Comput. Eng., Tech. Univ. of Crete, Chania, Greece)

The structural properties of Recurrent High-Order Neural Networks (RHONN) whose weights are restricted to satisfy the symmetry property are investigated. First, it is shown that these networks are gradient and stable dynamical systems; moreover, they remain stable when either bounded deterministic or multiplicative stochastic disturbances perturb their dynamics. Then, we prove that such networks are capable of approximating, arbitrarily closely, a large class of dynamical systems of the form χ̇ = F(χ). Appropriate learning laws that enable these neural networks to approximate (identify) unknown dynamical systems are also proposed. The learning laws are based on Lyapunov stability theory, and they ensure error stability and robustness.
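To make the identification scheme concrete, the following is a minimal, hypothetical sketch (not the paper's exact formulation): an RHONN identifier of the form χ̂̇ = -a·χ̂ + W·z(χ), where z(χ) collects sigmoidal high-order terms, with a gradient-type learning law Ẇ = -γ·e·z(χ)ᵀ on the identification error e = χ̂ - χ. All names, gains, and the target plant are illustrative assumptions.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def z(x):
    # High-order regressor: sigmoidal terms and their products
    # (the "high-order connections" of an RHONN). Illustrative choice.
    s = sigmoid(x)
    return np.array([s[0], s[1], s[0] * s[1], s[0] ** 2, s[1] ** 2])

def identify(T=20.0, dt=1e-3, a=1.0, gamma=5.0):
    """Euler-integrate an unknown stable plant and an RHONN identifier."""
    x = np.array([0.5, -0.5])   # plant state (treated as unknown dynamics)
    xhat = np.zeros(2)          # identifier state
    W = np.zeros((2, 5))        # adjustable weight estimates
    for _ in range(int(T / dt)):
        # Unknown plant chi_dot = F(chi); a stable nonlinear example.
        xdot = np.array([-x[0] + np.tanh(x[1]), -x[1] + np.tanh(x[0])])
        e = xhat - x
        # RHONN identifier dynamics and gradient learning law.
        xhat = xhat + dt * (-a * xhat + W @ z(x))
        W = W - dt * gamma * np.outer(e, z(x))
        x = x + dt * xdot
    return np.linalg.norm(xhat - x)

err = identify()
```

Because the regressor z(χ) cannot represent F(χ) exactly, a modeling-error term remains; the Lyapunov-based law guarantees the identification error stays bounded rather than vanishing, which is the error-stability and robustness property the abstract refers to.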

Published in:

IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing (Volume: 42, Issue: 9)