In the context of remote sensing image analysis, Markov random field (MRF) models are relevant image-analysis tools, thanks to their ability to integrate contextual information associated with the image data into the analysis process. However, especially in supervised classification, the estimation of the internal parameters of the adopted MRF model remains an open issue, typically addressed through time-consuming "trial-and-error" procedures. In the present paper, an automatic supervised MRF parameter optimization algorithm is proposed that can be applied to a broad class of MRF models. The method formulates the parameter estimation problem as the solution of a set of linear inequalities, which is solved by extending to the present context the Ho-Kashyap algorithm, originally proposed to compute a linear discriminant function for binary classification. The method is validated experimentally on three different (single-date and multitemporal) data sets, each associated with a distinct MRF model.
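As background for the approach sketched in the abstract, the classical Ho-Kashyap procedure seeks a vector satisfying a system of linear inequalities Ya > 0 by jointly updating a positive margin vector b and the solution a = Y⁺b. The sketch below is a minimal NumPy illustration of that classical algorithm only (the paper's MRF-specific extension is not reproduced here); all function names, data, and parameter values are illustrative assumptions.

```python
import numpy as np

def ho_kashyap(Y, eta=0.5, b0=1.0, max_iter=1000, tol=1e-6):
    """Classical Ho-Kashyap: find a such that Y @ a > 0 (if separable).

    Y: (n, d) matrix whose rows encode the linear inequalities.
    Returns the weight vector a and the final margin vector b.
    """
    n, _ = Y.shape
    b = np.full(n, b0)            # positive margin vector, kept > 0
    Y_pinv = np.linalg.pinv(Y)    # pseudoinverse, reused at each step
    a = Y_pinv @ b                # least-squares solution of Y a = b
    for _ in range(max_iter):
        e = Y @ a - b                     # error vector
        if np.linalg.norm(e) < tol:
            break
        b = b + eta * (e + np.abs(e))     # increase b only where e > 0
        a = Y_pinv @ b
    return a, b

# Illustrative usage: two linearly separable 2-D point clouds.
rng = np.random.default_rng(0)
X1 = rng.normal([2.0, 2.0], 0.3, (20, 2))    # class 1 samples
X2 = rng.normal([-2.0, -2.0], 0.3, (20, 2))  # class 2 samples
# Augment with a bias term and negate class-2 rows so every
# inequality reads (Y @ a) > 0 for a separating a.
Y = np.vstack([np.hstack([X1, np.ones((20, 1))]),
               -np.hstack([X2, np.ones((20, 1))])])
a, b = ho_kashyap(Y)
print(bool((Y @ a > 0).all()))
```

For this separable toy example the iteration drives the error norm toward zero and all inequalities end up satisfied; on non-separable data the classical algorithm instead signals infeasibility through a persistently nonpositive error vector.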