
Divergence measures based on the Shannon entropy



Abstract:

A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known Kullback divergences, the new measures do not require the condition of absolute continuity to be satisfied by the probability distributions involved. More importantly, their close relationships with the variational distance and with the probability of misclassification error are established in terms of bounds. These bounds are crucial in many applications of divergence measures. The measures are also well characterized by the properties of nonnegativity, finiteness, semiboundedness, and boundedness.
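As an illustration of the two properties the abstract highlights, the following sketch compares the Kullback-Leibler divergence with the Jensen-Shannon divergence introduced in this paper. It is a minimal reading of the definitions, not the paper's own code; the function names and the example distributions are mine. When one distribution assigns positive probability where the other assigns zero, absolute continuity fails and the KL divergence is infinite, while the entropy-based measure remains finite and (in base 2) bounded by 1.

```python
import math

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2 p_i, with the convention 0 * log 0 = 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """D(p || q); infinite when q_i = 0 but p_i > 0 (absolute continuity fails)."""
    return sum(a * math.log2(a / b) if b > 0 else math.inf
               for a, b in zip(p, q) if a > 0)

def jensen_shannon_divergence(p, q):
    """JS(p, q) = H((p + q)/2) - (H(p) + H(q))/2, defined for any pair p, q."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return shannon_entropy(m) - (shannon_entropy(p) + shannon_entropy(q)) / 2

# Example distributions whose supports only partially overlap (hypothetical):
p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]

print(kl_divergence(p, q))              # inf: KL needs absolute continuity
print(jensen_shannon_divergence(p, q))  # 0.5: finite and bounded by 1
```

The boundedness follows because the mixture (p + q)/2 dominates both p and q, so no log ratio can diverge; this is what makes the measure usable for distributions with differing supports.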
Published in: IEEE Transactions on Information Theory ( Volume: 37, Issue: 1, January 1991)
Page(s): 145 - 151
Date of Publication: 06 August 2002

