A unified approach is given for constructing cross-entropy and dissimilarity measures between probability distributions, based on a given entropy function or diversity measure. Special properties of the quadratic entropy introduced by Rao are described. In particular, it is shown that the square root of the Jensen difference (a dissimilarity measure) arising from a quadratic entropy provides a metric on a probability space. Several characterizations of quadratic entropy are obtained.
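The metric property stated above can be illustrated numerically. The sketch below (an illustration, not taken from the paper) uses Rao's quadratic entropy H(p) = Σᵢⱼ dᵢⱼ pᵢ pⱼ and the Jensen difference J(p, q) = H((p+q)/2) − (H(p) + H(q))/2; in the special Gini-Simpson case dᵢⱼ = 1 for i ≠ j, one has H(p) = 1 − Σ pᵢ², and √J reduces to half the Euclidean distance, so the triangle inequality is immediate. The distributions p, q, r below are arbitrary illustrative choices.

```python
import numpy as np

def quadratic_entropy(p, d):
    # Rao's quadratic entropy: H(p) = sum_ij d_ij * p_i * p_j,
    # where d is a symmetric matrix of dissimilarities between categories.
    p = np.asarray(p, dtype=float)
    return p @ d @ p

def jensen_difference(p, q, d):
    # Jensen difference: J(p, q) = H((p+q)/2) - (H(p) + H(q)) / 2.
    # Concavity of H guarantees J >= 0.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = (p + q) / 2
    return quadratic_entropy(m, d) - (quadratic_entropy(p, d) + quadratic_entropy(q, d)) / 2

# Gini-Simpson case: d_ij = 1 for i != j, so H(p) = 1 - sum_i p_i^2.
d = 1.0 - np.eye(3)

# Arbitrary example distributions on three categories.
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.3, 0.6])
r = np.array([0.3, 0.3, 0.4])

# The dissimilarity measure sqrt(J) claimed to be a metric.
dist = lambda a, b: np.sqrt(jensen_difference(a, b, d))

# Triangle inequality check for sqrt(J).
assert dist(p, r) + dist(r, q) >= dist(p, q)
```

In this Gini-Simpson case the identity J(p, q) = Σᵢ (pᵢ − qᵢ)²/4 holds exactly, so √J = ‖p − q‖₂/2 and all metric axioms follow; the general quadratic-entropy result of the paper covers arbitrary dissimilarity matrices d.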