Regression Approaches to Small Sample Inverse Covariance Matrix Estimation for Hyperspectral Image Classification

3 Author(s)

A key component in most parametric classifiers is the estimation of an inverse covariance matrix. In hyperspectral images, the number of bands can be in the hundreds, leading to covariance matrices with tens of thousands of elements. Recently, the use of linear regression for estimating the inverse covariance matrix has been introduced in the time-series literature. This paper adopts and expands these ideas to ill-posed hyperspectral image classification problems. The results indicate that at least some of the approaches can give a lower classification error than traditional methods such as linear discriminant analysis and regularized discriminant analysis. Furthermore, the results show that, contrary to earlier beliefs, estimating long-range dependencies between bands appears necessary to build an effective hyperspectral classifier, and that the high correlations between neighboring bands seem to allow differing sparsity configurations of the inverse covariance matrix to obtain similar classification results.
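One well-known regression approach of the kind the abstract describes is the modified Cholesky decomposition: each variable is regressed on its predecessors, and the regression coefficients and residual variances directly yield the precision (inverse covariance) matrix. The sketch below illustrates this idea with a ridge penalty to keep the regressions well-posed in the small-sample setting; it is a generic illustration under these assumptions, not the paper's exact method.

```python
import numpy as np

def regression_precision(X, ridge=1e-2):
    """Estimate the precision (inverse covariance) matrix of X via
    sequential regressions (modified Cholesky decomposition).

    Each variable x_j is regressed on its predecessors x_1..x_{j-1};
    the negated coefficients fill a unit lower-triangular matrix T and
    the residual variances a diagonal D, giving precision = T' D^{-1} T.
    The ridge penalty keeps the regressions solvable when the sample
    size n is smaller than the number of variables p (the ill-posed,
    small-sample case discussed in the abstract).
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)        # center each variable
    T = np.eye(p)                  # unit lower-triangular coefficients
    d = np.empty(p)                # residual variances
    d[0] = Xc[:, 0].var() + 1e-12
    for j in range(1, p):
        Z = Xc[:, :j]
        # ridge-regularized least squares: (Z'Z + ridge*I) b = Z'y
        b = np.linalg.solve(Z.T @ Z + ridge * np.eye(j), Z.T @ Xc[:, j])
        T[j, :j] = -b
        d[j] = (Xc[:, j] - Z @ b).var() + 1e-12
    return T.T @ np.diag(1.0 / d) @ T
```

Because the result is assembled as T' D^{-1} T with positive residual variances, the estimate is symmetric and positive definite by construction, even when n < p, which is exactly what a parametric classifier needs from a covariance inverse.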

Published in:

IEEE Transactions on Geoscience and Remote Sensing (Volume: 46, Issue: 10)