Daugman's Gabor transform as a simple generative back propagation network

2 Author(s)
Cohen, D. ; Royal Holloway & Bedford New College, Egham, UK ; Shawe-Taylor, J.

Much work has been performed on learning mechanisms for neural networks, and a particular area of interest has been their use in image processing. Two important pieces of work in this area are unified: an architecture and learning scheme for neural networks called generative back propagation (GBP), developed previously, and a system for image compression and filtering based on 2-D Gabor transforms that used a neural-network-style architecture. It is shown that Daugman's procedure, which used a four-layer neural network, can be exactly replicated by a two-layer generative back propagation network with half the number of units. The GBP update rule is shown to perform the same change as Daugman's rule, but more efficiently.
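
The coefficient-relaxation step at the heart of both schemes can be sketched in a few lines. The sketch below is a minimal plain-NumPy illustration, not the paper's implementation: all kernel sizes, frequencies, and step sizes are illustrative assumptions. It builds a small, deliberately non-orthogonal bank of 2-D Gabor elementary functions and fits coefficients to a patch by gradient descent on the squared reconstruction error, which is the update that both Daugman's relaxation network and the GBP formulation perform.

    import numpy as np

    def gabor_kernel(size, sigma, theta, freq, phase):
        # Real 2-D Gabor elementary function: a Gaussian envelope
        # modulating an oriented sinusoidal carrier.
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)  # rotated coordinate
        env = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
        return env * np.cos(2.0 * np.pi * freq * xr + phase)

    # A small, non-orthogonal Gabor basis (parameters are assumptions).
    size = 17
    basis = np.stack([
        gabor_kernel(size, sigma=3.0, theta=t, freq=f, phase=p).ravel()
        for t in (0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)
        for f in (0.1, 0.2)
        for p in (0.0, np.pi / 2)
    ])                                  # shape: (n_units, n_pixels)

    rng = np.random.default_rng(0)
    target = rng.standard_normal(size * size)  # stand-in image patch

    # Gradient descent on E = ||I - a @ G||^2. Because the basis is not
    # orthogonal, the coefficients cannot be read off by projection;
    # instead each step applies da_k proportional to <I - I_hat, G_k>,
    # converging toward the least-squares coefficients.
    a = np.zeros(basis.shape[0])
    lr = 1.0 / (basis ** 2).sum()       # conservative, stable step size
    for _ in range(500):
        residual = target - a @ basis   # I - I_hat
        a += lr * (basis @ residual)

    print("final squared error:", float(residual @ residual))

In network terms, the rows of basis play the role of the output-layer weights and the coefficients a are the activations of the single hidden layer; the abstract's claim is that this two-layer view does the same work as Daugman's four-layer arrangement with half the units.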

Published in:

Electronics Letters (Volume: 26, Issue: 16)