
Improving the capacity of complex-valued neural networks with a modified gradient descent learning rule


1 Author(s)
Donq-Liang Lee ; Dept. of Electron. Eng., Ta-Hwa Inst. of Technol., Hsin-Chu, Taiwan

Jankowski et al. (1996) proposed a complex-valued neural network (CVNN) capable of storing and recalling gray-scale images. The convergence of the CVNN has also been proven by means of the energy-function approach. However, the memory capacity of the CVNN is very low because its connection matrix is constructed with a generalized Hebb rule. In this letter, a modified gradient descent learning rule (MGDR) is proposed to enhance the capacity of the CVNN. The proposed technique is derived by applying gradient search over a complex error surface. Simulation shows that the capacity of the CVNN with the MGDR is greatly improved.
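The contrast between the two learning rules in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the phase resolution `K`, network size `N`, learning rate `eta`, and iteration count are assumed values, and the gradient step simply minimizes the squared recall error over the stored patterns in the spirit of a gradient search over a complex error surface.

```python
import numpy as np

K = 8   # number of quantized phase states (assumed resolution)
N = 16  # number of neurons (assumed network size)
P = 3   # number of stored patterns

rng = np.random.default_rng(0)

def csign(z, K):
    """Multistate complex-signum activation: snap each component
    of z to the nearest of K equally spaced unit-circle phases."""
    phase = np.angle(z) % (2 * np.pi)
    k = np.floor(phase * K / (2 * np.pi) + 0.5) % K
    return np.exp(2j * np.pi * k / K)

# Stored patterns: components drawn from the K phase states
patterns = np.exp(2j * np.pi * rng.integers(0, K, size=(P, N)) / K)

# Generalized Hebb rule: W = (1/N) * sum over patterns of x x^H
W = sum(np.outer(x, x.conj()) for x in patterns) / N

# Gradient-descent refinement (a sketch in the spirit of the MGDR):
# reduce the squared recall error ||W x - x||^2 for each stored
# pattern by stepping W along the negative gradient e x^H.
eta = 0.05
for _ in range(200):
    for x in patterns:
        e = W @ x - x                     # complex error vector
        W -= eta * np.outer(e, x.conj())  # gradient of ||e||^2 w.r.t. W

# Each stored pattern should now be (close to) a fixed point
ok = all(np.allclose(csign(W @ x, K), x) for x in patterns)
```

Under the plain Hebb rule, crosstalk between non-orthogonal patterns corrupts recall as the number of stored images grows; the gradient steps explicitly drive that residual error toward zero, which is the mechanism behind the capacity improvement the letter reports.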

Published in:

IEEE Transactions on Neural Networks (Volume: 12, Issue: 2)