Stochastic models of images are commonly represented in terms of random processes (random fields) defined on the region of support of the image. The observed image process G is considered a composite of two random processes: a high-level process X, which represents the regions (or classes) that form the observed image, and a low-level process Y, which describes the statistical characteristics of each region (or class). The representation G = (X, Y) has been widely used in the image processing literature over the past two decades. In this paper we show how to use the expectation-maximization (EM) algorithm to obtain an accurate model of the low-level image using a mixture of normal distributions. The main idea of the proposed algorithm is as follows: first, the EM algorithm is used to find the dominant mixture components in the given (empirical) density; then the absolute error between the empirical density and the estimated density is itself treated as a density, and the EM algorithm is used to estimate the number of mixture components in this error and the parameters of each component. The estimated density of the absolute error is then added to or subtracted from the estimated density according to the sign of the error. Convergence to the true distribution is tested using the Levy distance. A popular model for the high-level process X has been the Gibbs-Markov random field (GMRF) model. In this paper we use the approach described in A. El-Baz et al. (July 2003) to estimate the parameters of the GMRF. The approach has been applied to real images (spiral CT slices) and provides satisfactory results.
2003 IEEE Workshop on Statistical Signal Processing
Date of Conference: 28 Sept.-1 Oct. 2003
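The abstract gives no code, but its core building blocks can be sketched: an EM fit of a one-dimensional normal mixture to samples, and a numerical Levy distance between the empirical CDF and the fitted mixture CDF for the convergence test. This is a minimal illustration under assumed details (quantile initialization, a fixed iteration count, a grid-plus-bisection Levy approximation); all function names are hypothetical and not from the paper.

```python
import bisect
import math
import random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def em_gmm_1d(data, k, iters=200):
    """Fit a k-component 1D normal mixture to samples via EM.

    Returns (weights, means, variances). Means are initialized at data
    quantiles and every component starts with the overall sample variance.
    """
    n = len(data)
    s = sorted(data)
    mu = [s[(2 * j + 1) * n // (2 * k)] for j in range(k)]  # quantile init
    m = sum(data) / n
    v0 = max(1e-6, sum((x - m) ** 2 for x in data) / n)
    var = [v0] * k
    w = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of component j for sample i
        r = []
        for x in data:
            p = [w[j] * normal_pdf(x, mu[j], var[j]) for j in range(k)]
            t = sum(p) or 1e-300
            r.append([pj / t for pj in p])
        # M-step: re-estimate weights, means, and variances
        for j in range(k):
            nj = sum(r[i][j] for i in range(n)) or 1e-300
            w[j] = nj / n
            mu[j] = sum(r[i][j] * data[i] for i in range(n)) / nj
            var[j] = max(1e-6, sum(r[i][j] * (data[i] - mu[j]) ** 2
                                   for i in range(n)) / nj)
    return w, mu, var

def ecdf(data):
    """Empirical CDF of the samples, as a callable."""
    s, n = sorted(data), len(data)
    return lambda x: bisect.bisect_right(s, x) / n

def mixture_cdf(w, mu, var):
    """CDF of a fitted normal mixture, as a callable."""
    return lambda x: sum(wj * 0.5 * (1.0 + math.erf((x - mj) / math.sqrt(2.0 * vj)))
                         for wj, mj, vj in zip(w, mu, var))

def levy_distance(F, G, grid):
    """Approximate Levy distance between CDFs F and G: the smallest eps
    with F(x-eps)-eps <= G(x) <= F(x+eps)+eps, checked on a grid."""
    def ok(eps):
        return all(F(x - eps) - eps <= G(x) <= F(x + eps) + eps for x in grid)
    lo, hi = 0.0, 1.0  # Levy distance between CDFs is always <= 1
    for _ in range(40):  # bisection on eps
        mid = 0.5 * (lo + hi)
        if ok(mid):
            hi = mid
        else:
            lo = mid
    return hi
```

In the paper's scheme, this first fit would capture only the dominant components; the absolute error between the empirical and estimated densities would then be modeled by a second EM pass and added to or subtracted from the estimate according to the sign of the error, with the Levy distance serving as the convergence test.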