I. Introduction
Hyperspectral imaging is concerned with the measurement, analysis, and interpretation of spectra acquired from a given scene at a given distance by an airborne or satellite sensor. Two systems currently active and operated from airborne platforms are the NASA Jet Propulsion Laboratory's Airborne Visible/InfraRed Imaging Spectrometer (AVIRIS) and the Naval Research Laboratory's HYDICE sensor, which was developed by Hughes Danbury Optical Systems; many more are under development. Hyperspectral sensors are widely used in fields such as geology, agriculture, and intelligence, and a significant body of research addresses hyperspectral image processing, including automatic spectral target recognition (ASTR), image classification, and image fusion [1]–[6].

The major advantage of a hyperspectral sensor is its significantly improved spectral and spatial resolution. These improvements mean that many unknown signals can be uncovered as anomalies without prior knowledge, which has substantially expanded the domain of many analysis techniques. Hyperspectral sensors have also been shown to detect targets smaller than a single pixel. Detecting such targets requires relying on their spectral properties and identifying them at the subpixel scale, a task that cannot be accomplished with traditional spatial-based image processing techniques. Finally, real-time or nearly real-time processing of hyperspectral images is required for swift decision making, and this in turn depends on fast data processing.
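The subpixel-detection idea mentioned above can be illustrated with a simple spectral-angle comparison. The sketch below is not the method of this paper; the five-band signatures and the linear-mixing assumption are invented purely for illustration of why spectral (rather than spatial) information makes subpixel targets detectable.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference target
    signature; a smaller angle indicates a closer spectral match."""
    cos = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical 5-band reflectance signatures (illustration only).
target = np.array([0.2, 0.5, 0.9, 0.4, 0.1])
background = np.array([0.6, 0.6, 0.5, 0.5, 0.4])

# A mixed pixel whose area is 30% target and 70% background,
# under a linear mixing assumption: the target occupies less
# than one pixel, so no spatial technique could isolate it.
mixed = 0.3 * target + 0.7 * background

# Spectrally, the mixed pixel still sits closer to the target
# signature than a pure background pixel does, so thresholding
# the spectral angle can flag it as containing the target.
print(spectral_angle(mixed, target), spectral_angle(background, target))
```

Even though the target never fills a pixel, its spectral contribution pulls the mixed pixel's angle toward the target signature, which is the essence of subpixel spectral detection.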