The 2012 Data Fusion Contest, organized by the Data Fusion Technical Committee (DFTC) of the IEEE Geoscience and Remote Sensing Society (GRSS), aimed to investigate the potential of very high spatial resolution (VHR) multi-modal/multi-temporal image fusion. Three types of data sets collected over the downtown San Francisco area were distributed during the Contest: spaceborne multi-spectral imagery, spaceborne synthetic aperture radar (SAR) imagery, and airborne light detection and ranging (LiDAR) data. This paper highlights the three awarded research contributions, which investigate (i) a new metric to assess urban density (UD) from multi-spectral and LiDAR data, (ii) simulation-based techniques for jointly using SAR and LiDAR data in image interpretation and change detection, and (iii) radiosity methods to improve surface reflectance retrievals from optical data in complex illumination environments. In particular, these contributions demonstrate the usefulness of LiDAR data when fused with optical or SAR data. We believe these investigations will stimulate further research in the related areas.