Hebbian feature discovery improves classifier efficiency

3 Author(s)

Two neural network implementations of principal component analysis (PCA) are used to reduce the dimension of speech signals. The compressed signals are then used to train a feedforward classification network for vowel recognition. A comparison is made of classification performance, network size, and training time for networks trained with both compressed and uncompressed data. Results show that a significant reduction in training time, fivefold in the present case, can be achieved without a sacrifice in classifier accuracy. This reduction includes the time required to train the compression network. Thus, dimension reduction, as performed by unsupervised neural networks, is a viable tool for enhancing the efficiency of neural classifiers.
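The abstract does not specify which two PCA networks were compared, but a standard example of Hebbian feature discovery from that era is Sanger's generalized Hebbian algorithm (GHA), which trains a single linear layer to extract principal components without supervision. The sketch below is illustrative only (the learning rate, epoch count, and toy data are assumptions, not details from the paper); it shows how such a network recovers the leading principal direction of its input, which is the compression step described above.

```python
import numpy as np

def sanger_pca(X, n_components, lr=0.01, epochs=50, seed=0):
    """Learn approximate principal components with Sanger's
    generalized Hebbian algorithm (GHA): a one-layer linear
    network trained by a Hebbian rule with decorrelating feedback.
    Hyperparameters here are illustrative, not from the paper."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_components, n_features))
    for _ in range(epochs):
        for x in X:
            y = W @ x  # network outputs (compressed representation)
            # Sanger's rule: dW = lr * (y x^T - lower_tri(y y^T) W)
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Toy data whose variance lies mostly along one direction.
rng = np.random.default_rng(1)
direction = np.array([3.0, 1.0, 0.2])
X = rng.normal(size=(500, 1)) @ direction[None, :]
X += 0.05 * rng.normal(size=X.shape)
X -= X.mean(axis=0)

W = sanger_pca(X, n_components=1)
w = W[0] / np.linalg.norm(W[0])
true_dir = direction / np.linalg.norm(direction)
# |cosine| near 1 means the network found the first principal direction.
print(abs(w @ true_dir))
```

Once trained, `W @ x` gives the low-dimensional code that would be fed to the downstream classifier; with one component this rule reduces to Oja's rule, and additional rows learn successive components.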

Published in:

1990 IJCNN International Joint Conference on Neural Networks

Date of Conference:

17-21 June 1990