The aim of this paper is to introduce a novel statistical model-based image fusion method for Synthetic Aperture Radar (SAR) and optical images. Existing fusion algorithms are effective only in specific regions of the scene; as a result, the fused image may lack the information needed for subsequent processing such as classification and feature extraction. Our proposed method retains as much contextual and spatial information as possible from the source data by exploiting the relationship between spatial-domain cumulants and wavelet-domain cumulants. Our contributions are twofold: we integrate the relationship between the spatial- and wavelet-domain cumulants of the source images into the image fusion process, and we employ these wavelet cumulants to optimize the weights in a Cauchy convolution based image fusion scheme. The superior performance of the proposed algorithm is demonstrated in comparison with existing fusion algorithms on real SAR and optical images.
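To make the wavelet-cumulant weighting idea concrete, the sketch below fuses two images in a one-level Haar wavelet domain, weighting each detail subband by the relative magnitude of its fourth-order cumulant. This is only an illustrative simplification under assumed choices (Haar basis, a cumulant-ratio weight, plain averaging of the approximation band); it does not reproduce the paper's Cauchy convolution model or its optimization of the weights.

```python
import numpy as np

def haar_dwt2(x):
    """One-level 2-D Haar transform: returns (LL, (LH, HL, HH))."""
    a = (x[0::2, :] + x[1::2, :]) / 2.0   # row averages
    d = (x[0::2, :] - x[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, (lh, hl, hh)

def haar_idwt2(ll, bands):
    """Inverse of haar_dwt2 (perfect reconstruction)."""
    lh, hl, hh = bands
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d = np.empty_like(a)
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    x = np.empty((a.shape[0] * 2, a.shape[1]))
    x[0::2, :], x[1::2, :] = a + d, a - d
    return x

def kappa4(z):
    """Fourth-order cumulant of a (centred) sample: E[z^4] - 3 E[z^2]^2."""
    z = z - z.mean()
    return np.mean(z ** 4) - 3.0 * np.mean(z ** 2) ** 2

def fuse_cumulant(sar, opt, eps=1e-12):
    """Fuse two equally sized images; subband weights come from |kappa4|.

    The weight rule here (relative non-Gaussianity of each detail band)
    is an assumed heuristic standing in for the paper's optimized weights.
    """
    ll_s, det_s = haar_dwt2(sar)
    ll_o, det_o = haar_dwt2(opt)
    fused_det = []
    for ds, do in zip(det_s, det_o):
        ks, ko = abs(kappa4(ds)), abs(kappa4(do))
        w = ks / (ks + ko + eps)          # favour the heavier-tailed band
        fused_det.append(w * ds + (1.0 - w) * do)
    ll_f = 0.5 * (ll_s + ll_o)            # average the approximation band
    return haar_idwt2(ll_f, tuple(fused_det))
```

Because the Haar pair reconstructs perfectly, the fused image differs from the sources only through the subband mixing; any other orthogonal wavelet could be substituted without changing the weighting logic.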