In this paper, we present a new algorithm that estimates both the entropy of and the divergence between two finite-alphabet, finite-memory tree sources, using only the information provided by a single realization from each of the two sources. Our algorithm outperforms a previous LZ-based method. It is motivated by data compression based on the Burrows-Wheeler block sorting transform, and exploits the fact that if the input is a finite-memory tree source, then the divergence between the output distribution and a piecewise stationary memoryless distribution vanishes as the length of the input sequence goes to infinity.
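The abstract refers to the Burrows-Wheeler transform and to entropy estimation on its output. The following is only an illustrative sketch, not the paper's algorithm: it shows a naive Burrows-Wheeler transform and a plug-in (maximum-likelihood) entropy estimate that one could apply to segments of the transform's output; the function names and the sentinel convention are my own assumptions.

```python
import math
from collections import Counter

def bwt(s: str) -> str:
    # Naive Burrows-Wheeler transform via sorted cyclic rotations.
    # '\x00' serves as a unique end-of-string sentinel (an assumption
    # of this sketch; real implementations often use suffix arrays).
    s = s + "\x00"
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def empirical_entropy(s: str) -> float:
    # Plug-in entropy estimate in bits per symbol, computed from
    # the empirical symbol frequencies of the string.
    counts = Counter(s)
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Example: transform a string, then estimate the zeroth-order
# entropy of the transformed output.
out = bwt("banana")
h = empirical_entropy(out)
```

Because the BWT of a finite-memory tree source's output is close to piecewise stationary memoryless, an estimator along the paper's lines would segment the transformed sequence and apply a memoryless estimate per segment; the sketch above shows only the two basic ingredients.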
Date of Conference: 2002