Convolutive blind source separation by minimizing mutual information between segments of signals

4 Author(s)
K. E. Hild (Dept. of Radiology, University of California, San Francisco, CA, USA); D. Pinto; D. Erdogmus; J. C. Principe

Abstract:

A method to perform convolutive blind source separation of super-Gaussian sources by minimizing the mutual information between segments of output signals is presented. The proposed approach is essentially an implementation of an idea previously proposed by Pham. The formulation of mutual information in the proposed criterion makes use of a nonparametric estimator of Renyi's α-entropy, which becomes Shannon's entropy in the limit as α approaches 1. Since α can be any positive number, this yields an infinite family of criteria. Interestingly, it appears that Shannon's entropy cannot be used for convolutive source separation with this type of estimator. In fact, only one value of α appears to be appropriate, namely α = 2, which corresponds to Renyi's quadratic entropy. Four experiments are included to show the efficacy of the proposed criterion.
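
The α = 2 case singled out in the abstract is the one value for which a Parzen-window (kernel) density estimate yields a closed-form entropy estimate: Renyi's quadratic entropy reduces to the negative log of the mean pairwise kernel evaluation, often called the information potential. The sketch below illustrates that quadratic-entropy estimator only; the Gaussian kernel, its width, and the test distributions are illustrative assumptions, not the paper's exact settings or its full separation criterion.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Nonparametric estimate of Renyi's quadratic entropy H_2(X).

    Uses a Parzen-window density estimate with Gaussian kernels, for which
    the alpha = 2 case has a closed form: a double sum of Gaussian kernels
    (width sigma * sqrt(2)) over all sample pairs, the "information
    potential". The kernel width `sigma` is an illustrative choice.
    """
    x = np.asarray(x, dtype=float).ravel()
    # Pairwise differences between all samples.
    diffs = x[:, None] - x[None, :]
    # Convolving two Gaussian kernels of width sigma gives width sigma*sqrt(2).
    s2 = 2.0 * sigma ** 2
    kernel = np.exp(-diffs ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    # Information potential V(X): mean of the pairwise kernel evaluations.
    information_potential = kernel.mean()
    # H_2(X) = -log V(X)
    return -np.log(information_potential)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gaussian = rng.normal(size=2000)
    laplacian = rng.laplace(size=2000)  # super-Gaussian, like the paper's sources
    print("H2 (Gaussian):  %.3f" % renyi_quadratic_entropy(gaussian))
    print("H2 (Laplacian): %.3f" % renyi_quadratic_entropy(laplacian))
```

In a separation criterion of the kind the abstract describes, estimates like this would be combined across segments of the output signals to approximate their mutual information, which is then minimized with respect to the demixing filters; that optimization loop is beyond this sketch.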

Published in:

IEEE Transactions on Circuits and Systems I: Regular Papers (Volume: 52, Issue: 10)