In the past decade, the concept of a coherent risk measure has found many applications in finance, insurance, and operations research. In this paper, we introduce a new class of coherent risk measures constructed from information-type pseudo-distances that generalize the Kullback-Leibler divergence, also known as the relative entropy. We first analyze the primal and dual representations of this class. We then study the entropic value-at-risk (EVaR), the member of this class associated with the relative entropy. We also show that the conditional value-at-risk (CVaR), the most popular coherent risk measure, belongs to this class and is a lower bound for EVaR.
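The closing claim, that CVaR is a lower bound for EVaR, can be illustrated numerically. The sketch below assumes the standard dual formula EVaR_{1-α}(X) = inf_{z>0} z⁻¹ ln(E[e^{zX}]/α) and the usual tail-average definition of CVaR; the function names, the grid search over z, and the normal sampling experiment are illustrative choices, not part of the paper.

```python
import numpy as np

def cvar(losses, alpha):
    """CVaR at confidence level 1 - alpha: the average of the
    worst alpha-fraction of the losses (larger = worse)."""
    losses = np.sort(losses)
    k = int(np.ceil(alpha * len(losses)))
    return losses[-k:].mean()

def evar(losses, alpha, z_grid=None):
    """EVaR at confidence level 1 - alpha via its dual representation,
    EVaR_{1-alpha}(X) = inf_{z>0} (1/z) * ln(E[exp(z X)] / alpha),
    approximated here by a simple grid search over z."""
    if z_grid is None:
        z_grid = np.linspace(0.05, 5.0, 250)
    vals = [np.log(np.mean(np.exp(z * losses)) / alpha) / z for z in z_grid]
    return min(vals)

# Illustration with standard normal losses: for X ~ N(0, 1) and
# alpha = 0.05, the closed forms are EVaR = sqrt(2 ln(1/alpha)) ~ 2.45
# and CVaR ~ 2.06, consistent with CVaR <= EVaR.
rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
print(f"CVaR_0.95 = {cvar(x, 0.05):.3f}")
print(f"EVaR_0.95 = {evar(x, 0.05):.3f}")
```

The gap between the two values reflects that EVaR weights the entire tail through the moment-generating function, whereas CVaR averages only the losses beyond the quantile.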