OSLVQ: a training strategy for optimum-size learning vector quantization classifiers

2 Author(s)
S. Cagnoni; Dept. of Electron. Eng., Florence Univ., Italy; G. Valli

In this paper we describe OSLVQ (optimum-size learning vector quantization), an algorithm for training learning vector quantization (LVQ) classifiers that achieves effective sizing of networks through a multistep procedure. In each step of the algorithm the network is first trained with a few iterations of one of the LVQ algorithms. After this partial training the structure of the network is updated according to the performance achieved in classifying the training set: we add neurons whose weight vectors lie in regions of the pattern space where several misclassified patterns are found, and we remove neurons that are activated by too few training patterns. Neurons are also removed if their presence is redundant, i.e., when, in their absence, other neurons representing the same class would respond to the same patterns. Results obtained on a set of patterns representing phonemes are reported and compared with those achieved by a standard LVQ classifier of similar size.
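
The abstract gives the structure-update rule only in outline. The Python sketch below is an illustration of one OSLVQ-style step under assumptions that are not from the paper: LVQ1 is taken as the partial-training rule, a new neuron is placed at the mean of each class's misclassified patterns, and the activation and redundancy criteria use arbitrary thresholds. Names such as lvq1_step and oslvq_structure_update are hypothetical.

import numpy as np

def lvq1_step(prototypes, labels, x, y, lr=0.05):
    # One LVQ1 update: pull the winning prototype toward pattern x if its
    # class matches y, push it away otherwise.
    i = np.argmin(np.linalg.norm(prototypes - x, axis=1))
    sign = 1.0 if labels[i] == y else -1.0
    prototypes[i] += sign * lr * (x - prototypes[i])
    return prototypes

def oslvq_structure_update(prototypes, labels, X, Y, min_hits=3):
    # After partial LVQ training, resize the network:
    #  1) add a prototype in each region with misclassified patterns
    #     (here: the mean of each class's misclassified patterns)
    #  2) remove prototypes activated by too few training patterns
    #  3) remove redundant prototypes, i.e. those whose patterns would be
    #     won by another prototype of the same class if they were removed
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    winners = d.argmin(axis=1)
    predicted = labels[winners]

    # 1) grow
    new_protos, new_labels = [], []
    for c in np.unique(Y):
        wrong = X[(Y == c) & (predicted != c)]
        if len(wrong) > 0:
            new_protos.append(wrong.mean(axis=0))
            new_labels.append(c)

    # 2) prune rarely activated prototypes
    hits = np.bincount(winners, minlength=len(prototypes))
    keep = hits >= min_hits

    # 3) prune redundant prototypes
    for i in np.where(keep)[0]:
        mine = winners == i
        if not mine.any():
            continue
        d_i = d[mine].copy()
        d_i[:, i] = np.inf                  # pretend prototype i is absent
        runner_up = d_i.argmin(axis=1)
        if np.all(labels[runner_up] == labels[i]):
            keep[i] = False

    if new_protos:
        prototypes = np.vstack([prototypes[keep]] + new_protos)
        labels = np.concatenate(
            [labels[keep], np.array(new_labels, dtype=labels.dtype)])
    else:
        prototypes, labels = prototypes[keep], labels[keep]
    return prototypes, labels

In a full training run, this structure update would be interleaved with several passes of lvq1_step over the training set, repeating the two phases until the network size and classification error stabilize, in line with the multistep procedure sketched in the abstract.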

Published in:

1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence), Volume 2

Date of Conference:

27 Jun-2 Jul 1994