Many problems in information theory involve optimizing the Kullback-Leibler (KL) divergence between probability distributions. Because KL divergence is difficult to analyze directly, these optimizations are often intractable. We simplify such problems by assuming that the distributions of interest are close to each other. Under this assumption, the KL divergence behaves like a squared Euclidean distance. With this simplification, we solve the open problem of broadcasting with degraded message sets, a canonical problem in network information theory.
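The local behavior described above can be checked numerically: for a small perturbation of a distribution, the KL divergence is well approximated by half the chi-squared distance, which is a weighted squared Euclidean distance. The sketch below (the specific distribution and perturbation are illustrative choices, not from the source) compares the two quantities.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def half_chi2(p, q):
    """Half the chi-squared distance between p and q:
    a squared Euclidean distance with weights 1/q_i."""
    return 0.5 * sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

# Illustrative example: perturb q slightly in a direction that keeps
# the probabilities summing to one.
q = [0.5, 0.3, 0.2]
eps = 1e-3
p = [q[0] + eps, q[1] - eps, q[2]]

print(kl(p, q))        # true KL divergence
print(half_chi2(p, q)) # local quadratic approximation
```

For perturbations of size eps, the two values agree up to terms of order eps cubed, which is why the quadratic (Euclidean) picture is accurate in the local regime the abstract assumes.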