Information measures based on Rényi entropy provide distance measures among a group of probability densities, with a tunable parameter that allows differences in the data to be resolved at varying levels of granularity. We interpret a recently developed measure, the α-JR divergence, as an alternative to mutual information (MI). In this paper we also present its potential as an improved ICA criterion and demonstrate its performance. We further propose a computationally efficient technique for approximating Rényi mutual divergence and apply it to the analysis of dependent data.
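To illustrate the kind of Rényi-based group divergence the abstract refers to, the sketch below computes the Jensen-Rényi divergence in its standard form, JR_α(p_1, …, p_n) = H_α(Σ ω_i p_i) − Σ ω_i H_α(p_i), where H_α is the Rényi entropy. This is a minimal illustration for discrete distributions, not the paper's own estimator; the function names and the mixture-weight convention are assumptions.

```python
import numpy as np

def renyi_entropy(p, alpha):
    # Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1.
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi_divergence(dists, weights, alpha):
    # JR divergence of a group of distributions: entropy of the weighted
    # mixture minus the weighted sum of the individual Rényi entropies.
    dists = np.asarray(dists, dtype=float)
    weights = np.asarray(weights, dtype=float)
    mixture = weights @ dists
    return renyi_entropy(mixture, alpha) - sum(
        w * renyi_entropy(p, alpha) for w, p in zip(weights, dists)
    )

# Identical distributions yield zero divergence; distinct ones yield
# a positive value for alpha < 1 (Rényi entropy is concave there).
p, q = np.array([0.5, 0.5]), np.array([0.9, 0.1])
print(jensen_renyi_divergence([p, p], [0.5, 0.5], alpha=0.7))  # 0.0
print(jensen_renyi_divergence([p, q], [0.5, 0.5], alpha=0.7))  # > 0
```

The tunable parameter α plays the role the abstract describes: varying it changes how strongly the measure weights high- versus low-probability regions when comparing the densities.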
2003 IEEE Workshop on Statistical Signal Processing
Date of Conference: 28 Sept.-1 Oct. 2003