The high dimensionality of hyperspectral data leads to poor parameter estimates in conventional classification methods when the amount of training data is fixed. Features in hyperspectral datasets are usually highly correlated, which further complicates estimation by introducing numerical instabilities into the covariance matrix estimates. To alleviate these problems, several dimension reduction strategies have been proposed in the literature, mostly in the class of linear transforms. Regularization of parameter estimates has also been suggested to counter the instabilities in covariance estimates. To benchmark some of these methods, we compare several dimension reduction and regularization methods on a difficult landcover classification problem.
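The covariance instability described above can be illustrated with a minimal shrinkage sketch: when there are fewer training samples than spectral bands, the sample covariance is singular, while a convex combination with a scaled identity target restores invertibility. This is a generic shrinkage estimator for illustration, not the specific regularization scheme evaluated in the paper; the function name and the choice of target are assumptions.

```python
import numpy as np

def regularized_covariance(X, alpha=0.1):
    """Shrink the sample covariance toward a scaled identity target.

    X: (n_samples, n_features) training data.
    alpha: shrinkage weight in [0, 1]; alpha=0 gives the raw sample
    covariance, alpha=1 the pure identity target (hypothetical parameter
    choice for illustration).
    """
    S = np.cov(X, rowvar=False)
    # Target preserves the total variance (average eigenvalue on the diagonal).
    target = (np.trace(S) / S.shape[0]) * np.eye(S.shape[0])
    return (1.0 - alpha) * S + alpha * target

# Few samples in many bands: the raw covariance has rank < n_features
# and cannot be inverted, but the shrunk estimate is positive definite.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 50))  # 20 training samples, 50 spectral bands
S_reg = regularized_covariance(X, alpha=0.2)
print(np.linalg.eigvalsh(S_reg).min() > 0)
```

In a quadratic classifier such as Gaussian maximum likelihood, the inverse of `S_reg` can then be used in place of the (singular) sample covariance inverse.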