Multispectral (MS) images provided by Earth observation satellites generally have a poor spatial resolution, while panchromatic (PAN) images exhibit a spatial resolution two to four times finer. Data fusion is a means to synthesize MS images at a higher spatial resolution than the original by exploiting the high spatial resolution of the PAN. This process is often called pansharpening. The synthesis property states that the synthesized MS images should be as close as possible to those that would have been acquired by the corresponding sensors if they had this high resolution. Methods based on the concept Amélioration de la Résolution Spatiale par Injection de Structures (ARSIS) are able to deliver synthesized images with good spectral quality, but their geometrical quality can still be improved. We propose a more precise definition of the synthesis property in terms of geometry. Then, we present a method that explicitly takes into account the difference in modulation transfer function (MTF) between PAN and MS in the fusion process. This method is applied to an existing ARSIS-based fusion method, i.e., the à trous wavelet transform with model 3. Simulated images of the Pleiades and SPOT-5 sensors are used to illustrate the performance of the approach. Although this paper is limited in methods and data, we observe a better restitution of the geometry and an improvement in all indices classically used in quality budgets in pansharpening. We also present a means to assess compliance with the synthesis property from an MTF point of view.
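To make the MTF idea concrete, the sketch below builds a Gaussian low-pass filter whose frequency response matches a prescribed sensor MTF value at the MS Nyquist frequency, and uses it to extract spatial details from a PAN image. This is a minimal illustration of MTF-matched filtering, not the paper's ARSIS à trous wavelet transform-model 3 method; the Gaussian MTF model, the chosen parameter values, and the simple high-pass detail-injection step are illustrative assumptions.

```python
import numpy as np

def mtf_gaussian_kernel(mtf_at_nyquist, ratio, size=41):
    """1-D Gaussian kernel whose frequency response equals `mtf_at_nyquist`
    at the MS Nyquist frequency (1 / (2 * ratio) cycles per PAN sample).

    Assumes the common Gaussian MTF model H(f) = exp(-2 * pi^2 * sigma^2 * f^2).
    """
    fc = 1.0 / (2.0 * ratio)
    # Solve H(fc) = mtf_at_nyquist for sigma.
    sigma = np.sqrt(-np.log(mtf_at_nyquist) / (2.0 * np.pi**2 * fc**2))
    ax = np.arange(size) - size // 2
    g = np.exp(-(ax**2) / (2.0 * sigma**2))
    return g / g.sum()  # normalized, separable kernel

def lowpass(img, kernel):
    """Separable 2-D convolution with reflective padding (illustrative, not fast)."""
    pad = len(kernel) // 2
    out = np.apply_along_axis(
        lambda r: np.convolve(np.pad(r, pad, mode="reflect"), kernel, "valid"),
        1, img)
    out = np.apply_along_axis(
        lambda c: np.convolve(np.pad(c, pad, mode="reflect"), kernel, "valid"),
        0, out)
    return out

def inject_details(ms_upsampled, pan, mtf_at_nyquist=0.3, ratio=4):
    """Simplified high-pass injection: add to the upsampled MS band the PAN
    details removed by the MTF-matched low-pass filter. Hypothetical values:
    an MTF of 0.3 at Nyquist and a resolution ratio of 4 are only examples.
    """
    k = mtf_gaussian_kernel(mtf_at_nyquist, ratio)
    return ms_upsampled + (pan - lowpass(pan, k))
```

The key point the sketch conveys is that the cutoff of the extraction filter is not arbitrary: it is derived from the sensor MTF, so the details injected into the MS band are exactly those the MS sensor could not have resolved.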