Abstract:
The information-bottleneck (IB) principle is defined in terms of mutual information. This study defines mutual information between two random variables using the Jensen-Shannon (JS) divergence instead of the standard definition, which is based on the Kullback-Leibler (KL) divergence. We reformulate the information-bottleneck principle using the proposed mutual information and apply it to the problem of pairwise clustering. We show that applying IB to clustering tasks using JS divergences instead of KL yields improved results. This indicates that JS-based mutual information has expressive power at least equal to that of the standard KL-based mutual information.
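The JS-based mutual information described in the abstract can be sketched for discrete distributions by replacing KL with JS in the usual definition MI(X; Y) = D(p(x, y) || p(x)p(y)). The following is a minimal illustrative sketch, not code from the paper; function names and the discrete setting are assumptions.

```python
import numpy as np

def kl(p, q):
    # KL divergence (in nats) between discrete distributions,
    # skipping zero-probability cells of p by convention
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js(p, q):
    # Jensen-Shannon divergence: average KL of each distribution
    # to their equal-weight mixture m
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def js_mutual_information(pxy):
    # JS-based mutual information: the JS divergence between the
    # joint p(x, y) and the product of its marginals p(x) p(y)
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    return js(pxy.ravel(), (px * py).ravel())
```

Unlike the KL-based definition, this quantity is symmetric in its two arguments and bounded above by ln 2, which can be numerically convenient when comparing cluster assignments.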
Published in: ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date of Conference: 12-17 May 2019
Date Added to IEEE Xplore: 17 April 2019