In this paper a framework is proposed for efficient entropy coding of data that can be represented by a parametric distribution model. Based on the proposed framework, an entropy coder achieves coding efficiency by estimating the parameters of the statistical model of the coded data, via either maximum a posteriori (MAP) or maximum likelihood (ML) parameter estimation. The problem of optimal entropy coding for transmission of a block of data x1, x2, ..., xN can be formulated by assuming that the data comes from a source with a parametric probability mass function (pmf) P(X1, X2, ..., XN; θ) with parameter θ (in general, θ is a vector). The parametric model assumption makes it possible to assign a probability to the event of observing x1, x2, ..., xN, and to use this probability for entropy coding of the data while conveying only the parameter θ. The impressive results obtained with a simple parametric model, based on a geometric distribution of the coded data, for compression of natural images encourage further investigation of more elaborate data models, such as the Poisson distribution and mixture models.
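The ML variant of the scheme can be sketched as follows, under assumptions not fixed by the abstract: the samples are taken as i.i.d. from a geometric pmf on the nonnegative integers, P(x; p) = (1 - p) p^x, so the block probability factorizes and the ML estimate of p follows from the sample mean. The ideal (arithmetic-coding) codelength for the block is then -log2 of its probability under the fitted model; all function names here are illustrative, not from the paper.

```python
import math
import random

def geometric_pmf(x, p):
    # Assumed parameterization: P(x; p) = (1 - p) * p**x for x = 0, 1, 2, ...
    return (1.0 - p) * p ** x

def ml_estimate_p(block):
    # For this parameterization E[X] = p / (1 - p), so the ML estimate
    # from the sample mean m is p_hat = m / (1 + m).
    m = sum(block) / len(block)
    return m / (1.0 + m)

def ideal_codelength_bits(block, p):
    # Ideal entropy-coder codelength: -log2 of the block probability
    # under the fitted model (i.i.d. assumption => sum over samples).
    return -sum(math.log2(geometric_pmf(x, p)) for x in block)

random.seed(0)
p_true = 0.6
# Draw a synthetic block by inverting the geometric CDF:
# X = floor(ln(U) / ln(p)) with U uniform on (0, 1].
block = [math.floor(math.log(1.0 - random.random()) / math.log(p_true))
         for _ in range(1000)]

p_hat = ml_estimate_p(block)                 # close to p_true for large N
bits = ideal_codelength_bits(block, p_hat)   # total bits to convey the block
print(p_hat, bits / len(block))              # estimate and bits per symbol
```

In practice the decoder must also know p_hat, so the parameter itself is quantized and transmitted as side information; the bits-per-symbol figure above approaches the source entropy as the block length grows.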