Learning of fast transforms and spectral domain neural computing

Authors: Ersoy, O.K.; Chen, C.-H. (Sch. of Electr. Eng., Purdue Univ., West Lafayette, IN, USA)

The interaction between neural networks and fast transforms is examined. It is shown that the development, discovery, and study of transforms can be carried out efficiently through the use of the learning algorithms employed in neural networks. In turn, these transforms can be used for a number of tasks in neural networks, such as network reduction and simplification, fast convergence during learning, fast memory retrieval, reduced cost and increased speed of implementation, feature extraction, invariance to distortions, better generalization, and increased quality of performance in the presence of noise and incomplete knowledge. Learning with the unconstrained part of the neural network, which has reduced size or a minimized number of interconnections, is performed in the spectral domain only, thereby considerably easing the problems of convergence and implementation. The techniques described can be especially useful in dynamic neural networks.
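To make the spectral-domain learning idea concrete, the following is a minimal sketch, not the authors' algorithm: a fixed real FFT serves as the fast-transform front end, only the K lowest-frequency coefficients are retained (a stand-in for network reduction), and only a small linear classifier operating on those coefficients is trained. All names, hyperparameters, and the toy data (N, K, lr, the sinusoidal classes) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 32    # input dimension (time-domain samples)
K = 8     # retained spectral coefficients (illustrative "network reduction")
C = 3     # number of output classes
lr = 0.05 # learning rate

def to_spectrum(x):
    """Fixed, non-learned front end: real FFT, keep the K lowest-frequency
    coefficients, split into real/imaginary parts (length-2K feature vector)."""
    X = np.fft.rfft(x)[:K]
    return np.concatenate([X.real, X.imag])

def make_sample(label):
    """Toy data: two noisy sinusoidal classes plus a noise-only class."""
    t = np.arange(N)
    if label == 0:
        x = np.sin(2 * np.pi * 2 * t / N)
    elif label == 1:
        x = np.sin(2 * np.pi * 5 * t / N)
    else:
        x = np.zeros(N)
    return x + 0.2 * rng.standard_normal(N)

# Unconstrained (trainable) part: a single linear layer in the spectral domain.
W = 0.01 * rng.standard_normal((C, 2 * K))
b = np.zeros(C)

for epoch in range(200):
    for label in range(C):
        x = make_sample(label)
        f = to_spectrum(x)              # learning happens on f, not on x
        scores = W @ f + b
        p = np.exp(scores - scores.max())
        p /= p.sum()                    # softmax
        grad = p.copy()
        grad[label] -= 1.0              # cross-entropy gradient
        W -= lr * np.outer(grad, f)
        b -= lr * grad

# Quick check on fresh samples.
correct = sum(np.argmax(W @ to_spectrum(make_sample(c)) + b) == c
              for c in range(C) for _ in range(20))
print(f"accuracy: {correct / 60:.2f}")
```

Because the trainable part sees only 2K spectral features instead of N raw samples, the number of adjustable weights (and thus the convergence and implementation burden) is reduced, which is the kind of simplification the abstract attributes to spectral-domain learning.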

Published in: IEEE Transactions on Circuits and Systems (Volume 36, Issue 5)