Cumulative residual entropy, a new measure of information & its application to image alignment

Abstract:

We use the cumulative distribution of a random variable to define its information content and thereby develop a novel measure of information that parallels Shannon entropy, which we dub cumulative residual entropy (CRE). The key features of CRE are: (1) its definition is valid in both the continuous and discrete domains; (2) it is mathematically more general than Shannon entropy; and (3) it is easily computed from sample data, and these computations converge asymptotically to the true values. We define the cross-CRE (CCRE) between two random variables and apply it to solve the uni- and multimodal image alignment problem for parameterized (rigid, affine and projective) transformations. The key strengths of CCRE over the now-popular mutual information method (based on Shannon entropy) are significantly greater noise immunity and a much larger convergence range over the space of parameterized transformations. These strengths of CCRE are demonstrated via experiments on synthesized and real image data.
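The abstract's claim that CRE is easily computed from sample data can be illustrated with a short sketch. For a nonnegative random variable X, CRE is defined as -∫ P(X > λ) log P(X > λ) dλ; replacing the survival function P(X > λ) with its empirical (step-function) counterpart over the sorted samples gives a simple estimator. This is an illustrative sketch under that standard definition, not the authors' implementation; the function name `empirical_cre` is our own.

```python
import math
import random

def empirical_cre(samples):
    """Estimate the cumulative residual entropy of nonnegative samples.

    CRE(X) = -integral of P(X > lam) * log P(X > lam) d(lam), with
    P(X > lam) replaced by the empirical survival function, which is
    constant between consecutive order statistics.
    """
    xs = sorted(samples)
    n = len(xs)
    cre = 0.0
    for i in range(n - 1):
        # On the interval [xs[i], xs[i+1]), exactly n-1-i samples exceed lam.
        p = (n - 1 - i) / n
        if p > 0.0:
            cre -= (xs[i + 1] - xs[i]) * p * math.log(p)
    return cre

# For Exponential(rate=1), P(X > lam) = exp(-lam), so the true CRE is
# the integral of lam * exp(-lam), i.e. 1; the estimate approaches it
# as the sample size grows.
random.seed(0)
data = [random.expovariate(1.0) for _ in range(200000)]
print(empirical_cre(data))
```

The estimator converges to the true CRE as the sample size increases, consistent with the asymptotic-convergence property claimed in the abstract.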
Date of Conference: 13-16 October 2003
Date Added to IEEE Xplore: 03 April 2008
Print ISBN: 0-7695-1950-4
Conference Location: Nice, France

