Information-bottleneck Based on the Jensen-shannon Divergence with Applications to Pairwise Clustering


Abstract:

The information-bottleneck (IB) principle is defined in terms of mutual information. This study defines mutual information between two random variables using the Jensen-Shannon (JS) divergence instead of the standard definition, which is based on the Kullback-Leibler (KL) divergence. We reformulate the information-bottleneck principle using the proposed mutual information and apply it to the problem of pairwise clustering. We show that applying IB to clustering tasks using JS divergences instead of KL yields improved results. This indicates that JS-based mutual information is at least as expressive as the standard KL-based mutual information.
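The core substitution the abstract describes can be illustrated for discrete distributions: standard mutual information is the KL divergence between the joint p(x, y) and the product of marginals p(x)p(y), and the JS-based variant replaces that KL term with the Jensen-Shannon divergence. The following is a minimal sketch of that idea (not the paper's implementation; the function names and the example joint distribution are illustrative assumptions):

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence D_KL(p || q) in nats,
    # summed over the support of p.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js(p, q):
    # Jensen-Shannon divergence: a symmetrized, bounded (<= ln 2)
    # version of KL, measured against the midpoint distribution m.
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def mutual_information(joint, divergence=kl):
    # Generalized mutual information: divergence between the joint
    # p(x, y) and the product of its marginals p(x)p(y).
    # divergence=kl gives the standard definition; divergence=js
    # gives the JS-based variant discussed in the abstract.
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    return divergence(joint.ravel(), (px * py).ravel())

# Illustrative joint distribution of two correlated binary variables.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(mutual_information(pxy))                 # KL-based MI
print(mutual_information(pxy, divergence=js))  # JS-based MI
```

Both variants are zero exactly when X and Y are independent; the JS-based quantity is additionally bounded above by ln 2, which is one practical difference from the unbounded KL-based definition.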
Date of Conference: 12-17 May 2019
Date Added to IEEE Xplore: 17 April 2019
Conference Location: Brighton, UK
