Fast Gauss mixture image classification based on the central limit theorem

Authors: K. M. Ozonat and R. M. Gray, Dept. of Electrical Engineering, Stanford University, CA, USA

The Gauss mixture model (GMM)-based vector quantizer with the quadratic discriminant analysis (QDA) distortion measure provides an approach to statistical image classification. Recent work has concentrated on designing tree-structured vector quantizers for such problems using the QDA distortion measure and the BFOS algorithm for pruning. It has been shown that the tree-structured design often increases the correct classification rate for the same design complexity, avoids over-fitting through pruning, and makes it possible to incorporate other classification algorithms such as adaptive boosting. Both the full-search design and the tree-structured design are based on clustering with the Lloyd algorithm. Even when the true underlying distribution of the feature vectors follows (approximately) a Gauss mixture distribution, the variances of the Gaussian components estimated by the clustering algorithm tend to be smaller than those of the true distribution; clustering therefore introduces a variance bias. The work reported here aims to reduce the effects of this variance bias by invoking the central limit theorem for independent summands when the feature vectors are formed as (weighted) sums of the image block pixels. This is done through a joint quantization of the means and covariances of the image blocks and of the feature vectors derived from the image blocks. Our simulations indicate that, for both the full-search design and the tree-structured design, our algorithm improves classification accuracy. Finally, for the tree-structured classifier, we introduce a fast algorithm that uses only the median eigenvalue of the covariance matrix (instead of the full covariance matrix) of each Gaussian component in the classification stage.
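
The sketch below illustrates the kind of classification stage the abstract describes: each class has a Gauss mixture codebook, a feature vector is scored against each component with the standard QDA distortion (Mahalanobis term plus log-determinant), and the fast variant replaces the full covariance with (median eigenvalue) x identity so no matrix solve is needed. This is an illustrative reading under those assumptions, not the authors' implementation; all function names, data layouts, and the exact constant terms of the distortion are hypothetical.

```python
import numpy as np

def qda_distortion(x, mean, cov):
    """QDA distortion of feature vector x against a Gaussian component:
    (x - mean)^T cov^{-1} (x - mean) + log det(cov), constants dropped."""
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return diff @ np.linalg.solve(cov, diff) + logdet

def fast_qda_distortion(x, mean, median_eig):
    """Fast approximation: model the covariance as median_eig * I,
    so the Mahalanobis term reduces to a scaled Euclidean distance."""
    diff = x - mean
    dim = diff.shape[0]
    return diff @ diff / median_eig + dim * np.log(median_eig)

def classify(block_features, codebooks, fast=False):
    """Assign an image (its set of block feature vectors) to the class whose
    Gauss mixture codebook gives the smallest average minimum distortion.

    codebooks: dict mapping class label -> list of (mean, cov, median_eig).
    """
    scores = {}
    for label, components in codebooks.items():
        per_block = []
        for x in block_features:
            if fast:
                d = min(fast_qda_distortion(x, m, lam) for m, _, lam in components)
            else:
                d = min(qda_distortion(x, m, c) for m, c, _ in components)
            per_block.append(d)
        scores[label] = np.mean(per_block)
    return min(scores, key=scores.get)
```

In this reading, the speed-up comes from avoiding the per-component linear solve and log-determinant at classification time; the median eigenvalue of each component's covariance would be precomputed once when the codebook is designed.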

Published in:

2004 IEEE 6th Workshop on Multimedia Signal Processing

Date of Conference:

29 Sept.-1 Oct. 2004