Kullback-Leibler distance between complex generalized Gaussian distributions

4 Author(s)
Nafornita, C. (Politeh. Univ. of Timisoara, Timisoara, Romania); Berthoumieu, Y.; Nafornita, I.; Isar, A.

In texture classification, feature extraction can be performed in a transform domain. One way to preserve translation invariance is to use a complex transform such as the Hyperanalytic Wavelet Transform. Its subband coefficients exhibit a circularly symmetric density function, so they can be modeled by a particular form of the complex generalized Gaussian distribution (CGGD). The Kullback-Leibler (KL) divergence, or distance, can be used to measure the similarity between subband density functions. In this paper we derive a closed-form expression for the KL divergence between two complex generalized Gaussian distributions.
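The paper's closed form for the complex circular case is not reproduced in this abstract. For illustration, the analogous and well-known closed-form KL divergence between two *real-valued* zero-mean generalized Gaussian densities p(x) ∝ exp(-(|x|/α)^β) (Do & Vetterli, 2002) can be sketched as follows; the function name and parameterization here are illustrative, not from the paper:

```python
import math

def ggd_kl(alpha1, beta1, alpha2, beta2):
    """KL divergence D(p1 || p2) between two zero-mean real generalized
    Gaussian densities p(x) ~ exp(-(|x|/alpha)^beta), using the closed
    form of Do & Vetterli (2002). Gamma functions are evaluated through
    math.lgamma for numerical stability."""
    lg = math.lgamma
    return (math.log((beta1 * alpha2) / (beta2 * alpha1))
            + lg(1.0 / beta2) - lg(1.0 / beta1)
            + (alpha1 / alpha2) ** beta2
              * math.exp(lg((beta2 + 1.0) / beta1) - lg(1.0 / beta1))
            - 1.0 / beta1)

# Identical distributions have zero divergence (up to rounding):
print(ggd_kl(1.0, 2.0, 1.0, 2.0))
```

For β = 2 the GGD reduces to a Gaussian with σ = α/√2, and the expression above reduces to the familiar Gaussian-to-Gaussian KL divergence log(σ2/σ1) + σ1²/(2σ2²) − 1/2, which is a convenient sanity check.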

Published in:

Proceedings of the 20th European Signal Processing Conference (EUSIPCO), 2012

Date of Conference:

27-31 Aug. 2012