A parallel learning filter system that learns the KL-expansion from examples

Authors (2): R. Lenz; M. Osterberg (Linkoping Univ., Sweden)

A new method for learning in a single-layer linear neural network is investigated. It is based on an optimality criterion that maximizes the information in the outputs while simultaneously concentrating the outputs. The system consists of a number of so-called basic units, and it is shown that the stable states of these basic units correspond to the (pure) eigenvectors of the input correlation matrix. The authors show that the basic units learn in parallel and that the communication between the units is kept to a minimum. They discuss two different implementations of the learning rule: a heuristic one and one based on the Newton rule. They demonstrate the properties of the system with the help of two classes of examples: waveform analysis and simple OCR reading. In the waveform-analysis case the eigenfunctions of the system are known from group-theoretical studies, and the authors show that the system indeed stabilizes in these states.
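The abstract does not reproduce the authors' optimality criterion, but the key claim — a linear unit whose stable states are eigenvectors of the input correlation matrix — can be illustrated with Oja's rule, a standard single-unit learning rule with the same stable states. The sketch below is an assumption-laden stand-in, not the paper's method: the data, mixing matrix, and learning rate are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean inputs whose correlation matrix has a clearly
# dominant eigenvector (mixing matrix A is an arbitrary choice).
A = np.array([[3.0, 1.0],
              [1.0, 1.0]])
X = rng.standard_normal((5000, 2)) @ A.T   # rows are input samples

# One linear "basic unit" with weight vector w, trained online.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
eta = 0.01                                 # small fixed learning rate

for x in X:
    y = w @ x                              # unit output
    w += eta * y * (x - y * w)             # Oja's rule: Hebbian term
                                           # plus implicit normalization

# Compare with the top eigenvector of the correlation matrix.
C = X.T @ X / len(X)
eigvals, eigvecs = np.linalg.eigh(C)
v = eigvecs[:, -1]                         # largest-eigenvalue eigenvector

print(abs(w @ v))                          # near 1: w aligned with v
```

With several such units and a decorrelating interaction between them (the paper's point is that this interaction can be kept minimal), the units together recover the full KL-expansion rather than only the leading eigenvector.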

Published in:

Neural Networks for Signal Processing [1991], Proceedings of the 1991 IEEE Workshop

Date of Conference:

30 Sep-1 Oct 1991