Theories of evidence have already been applied more or less successfully to the fusion of remote sensing images. These attempts were based on classical evidential reasoning, which works under the condition that all sources of evidence and their fusion results relate to the same invariable (static) frame of discernment. When working with multitemporal remote sensing images, changes may occur between two images acquired at different times, and in particular applications these changes need to be detected efficiently. Classical evidential reasoning is suited to a frame of discernment that is invariable over time, but it cannot efficiently detect or represent the occurrence of change in heterogeneous remote sensing images when the frame may change over time. To overcome this limitation, dynamic evidential reasoning (DER) is proposed for the sequential fusion of multitemporal images. A new state-transition frame is defined in DER, and change occurrences can be precisely represented by introducing a state-transition operator. Two kinds of dynamic combination rules, working in the free model and in the constrained model, are proposed in this new framework to deal with the different cases. Moreover, the prior probability of state transitions is taken into account, and the link between DER and Dezert-Smarandache theory is presented. The belief functions used in DER are defined similarly to those in the Dempster-Shafer theory. As shown in the last part of this paper, DER is able to efficiently estimate the correct change detections as a postprocessing technique. Two applications are given to illustrate the interest of DER: the first example is based on a set of two SPOT images acquired before and after a flood, and the second example uses three QuickBird images acquired during an earthquake event.
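Since the belief functions in DER are defined similarly to those in Dempster-Shafer theory, a minimal sketch of classical Dempster combination on a static two-class frame may help fix ideas. This is not the DER state-transition rule from the paper; the class names and mass values below are illustrative assumptions only.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    with Dempster's rule: conjunctive combination of focal elements,
    then normalization that redistributes the conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    k = 1.0 - conflict
    return {s: m / k for s, m in combined.items()}

# Toy example on a frame {water, land}, e.g. per-pixel evidence from
# two images of a flooded area (values are made up for illustration).
W, L = frozenset({"water"}), frozenset({"land"})
m_img1 = {W: 0.6, L: 0.1, W | L: 0.3}
m_img2 = {W: 0.5, L: 0.3, W | L: 0.2}
fused = dempster_combine(m_img1, m_img2)
```

The key limitation the paper addresses is visible here: both mass functions must be defined over the same frame, so this rule alone cannot represent a pixel whose class changes between acquisitions.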