Sharp Inequalities for f -Divergences

3 Author(s)
Adityanand Guntuboyina; Sujayam Saha; Geoffrey Schiebinger (University of California, Berkeley, Berkeley, CA, USA)

f-divergences form a general class of divergences between probability measures that includes, as special cases, many divergences commonly used in probability, mathematical statistics, and information theory, such as the Kullback-Leibler divergence, chi-squared divergence, squared Hellinger distance, and total variation distance. In this paper, we study the problem of maximizing or minimizing an f-divergence between two probability measures subject to a finite number of constraints on other f-divergences. We show that these infinite-dimensional optimization problems can all be reduced to tractable optimization problems over small finite-dimensional spaces. Our results lead to a comprehensive and unified treatment of the problem of obtaining sharp inequalities between f-divergences. We demonstrate that many existing results on inequalities between f-divergences can be obtained as special cases of our results, and we also improve on some existing non-sharp inequalities.
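As a sketch of the definition underlying the abstract: for discrete distributions P and Q with common support, each divergence named above arises as D_f(P‖Q) = Σ_x q(x) f(p(x)/q(x)) for a suitable convex generator f with f(1) = 0. The following minimal Python illustration (function and variable names are illustrative, not from the paper) computes several of these:

```python
import numpy as np

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_x q(x) * f(p(x)/q(x)) for discrete distributions
    with common support (all entries of q assumed positive)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# Convex generators for the divergences named in the abstract
kl         = lambda t: t * np.log(t)           # Kullback-Leibler
chi2       = lambda t: (t - 1.0) ** 2          # chi-squared
hellinger2 = lambda t: (np.sqrt(t) - 1.0) ** 2 # squared Hellinger
tv         = lambda t: 0.5 * np.abs(t - 1.0)   # total variation

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
```

With these generators, for example, f_divergence(p, q, tv) recovers the usual total variation distance 0.5 * Σ_x |p(x) - q(x)|. The paper's optimization problems then ask for the extremal value of one such D_f subject to constraints on others.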

Published in:

IEEE Transactions on Information Theory (Volume: 60, Issue: 1)