Renyi entropy based divergence measures for ICA

Authors: Yufang Bao (Dept. of Radiology, School of Medicine, Miami Univ., FL, USA); H. Krim

Information measures based on Renyi entropy provide a distance measure among a group of probability densities, with a tunable parameter that allows differences in the data to be resolved at varying granularity. We interpret a recently developed measure, the α-JR divergence, as an alternative to mutual information (MI). We present its potential as an improved independent component analysis (ICA) criterion and demonstrate its performance. We also propose a computationally efficient technique for approximating the Renyi mutual divergence and apply it to the analysis of dependent data.
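The abstract does not reproduce the underlying formulas. As a point of reference, the following is a minimal sketch of the Renyi entropy, H_a(p) = log(sum_i p_i^a) / (1 - a), and of a Jensen-Renyi-type divergence computed from it, assuming discrete (histogram) density estimates, uniform mixture weights by default, and that the α-JR divergence is of the Jensen-Renyi family. The function names, the discrete setting, and the example values are illustrative assumptions, not the authors' implementation or their approximation technique.

import numpy as np

def renyi_entropy(p, alpha):
    # Renyi entropy of a discrete distribution p for alpha > 0, alpha != 1:
    # H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha)
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability bins before exponentiation
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi_divergence(ps, alpha, weights=None):
    # Jensen-Renyi divergence of a set of discrete distributions:
    # JR_alpha = H_alpha(sum_i w_i p_i) - sum_i w_i H_alpha(p_i)
    # (hypothetical helper; uses uniform weights w_i = 1/n when none are given)
    ps = np.asarray(ps, dtype=float)
    n = ps.shape[0]
    if weights is None:
        weights = np.full(n, 1.0 / n)
    mixture = weights @ ps  # weighted mixture of the distributions
    return renyi_entropy(mixture, alpha) - sum(
        w * renyi_entropy(p, alpha) for w, p in zip(weights, ps))

# Example: divergence between two 3-bin histograms, alpha = 0.5
# (alpha in (0, 1) keeps the Renyi entropy concave, so the value is nonnegative)
p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])
print(jensen_renyi_divergence([p, q], alpha=0.5))

In an ICA setting, such a divergence would be evaluated between the joint density of the recovered components and the product of their marginals (estimated, e.g., from histograms), so that the criterion vanishes when the components are independent.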

Published in: 2003 IEEE Workshop on Statistical Signal Processing

Date of Conference: 28 Sept.-1 Oct. 2003
