For many applications in spectral image analysis, the quantitative model used to describe the data is based on first- and second-order statistics, linear mixture models (i.e., the convex hull), and/or linear subspaces. An example of this for anomaly detection is the well-known RX algorithm, a statistical measure of the anomalousness of individual pixels relative to the mean and covariance of the image. While these models perform well for several applications, as sensor resolution improves and complex clutter in the image consequently has a stronger impact, the simple assumptions behind these algorithms are not necessarily well met. Here, we propose a novel approach to spectral image processing that instead models the data as a high-dimensional graph. The pixels, considered in the spectral space, form the nodes (vertices) of the graph, and an edge is created between two nodes if they satisfy some similarity criterion. Given this spectral graph, several metrics can be computed relating to the overall graph connectivity, as well as to the connectivity of individual pixels to the graph. This latter metric is used here as the anomaly detection metric. A hyperspectral image is tiled (providing a computational advantage in addition to a spatially adaptive background model) and the graph is computed per tile. Each pixel in the tile is then assigned an anomalousness score based on its weighted vertex volume, or connectivity, to the graph. Results are presented for a reflective hyperspectral image with known targets and are shown to be comparable to those of the RX algorithm.
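The two detectors contrasted above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the abstract does not specify the similarity criterion, so a Gaussian kernel on spectral distance is assumed here, as is scoring by the inverse of the weighted vertex degree (a common proxy for low graph connectivity); the `sigma` bandwidth and the small covariance ridge are likewise assumptions for numerical robustness.

```python
import numpy as np

def rx_scores(pixels):
    """Classic RX anomaly score for a tile of pixel spectra, shape (n, bands):
    squared Mahalanobis distance of each spectrum to the tile mean under the
    tile covariance."""
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov += 1e-6 * np.eye(cov.shape[0])  # small ridge (assumption, for stability)
    diff = pixels - mu
    cov_inv = np.linalg.inv(cov)
    # per-pixel quadratic form diff_i^T cov_inv diff_i
    return np.einsum('ij,jk,ik->i', diff, cov_inv, diff)

def graph_anomaly_scores(pixels, sigma=1.0):
    """Graph-based score: build a fully weighted graph on the pixel spectra
    with Gaussian similarity edge weights (an assumed criterion), then score
    each pixel by the inverse of its weighted vertex volume (degree) --
    pixels weakly connected to the rest of the tile score as anomalous."""
    d2 = np.sum((pixels[:, None, :] - pixels[None, :, :]) ** 2, axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(w, 0.0)          # no self-loops
    volume = w.sum(axis=1)            # weighted vertex volume of each node
    return 1.0 / (volume + 1e-12)     # low connectivity -> high anomalousness

# Toy usage: one synthetic tile of background spectra plus one spectral outlier.
rng = np.random.default_rng(0)
background = rng.normal(0.0, 1.0, size=(50, 5))
target = np.full((1, 5), 10.0)
tile = np.vstack([background, target])
rx = rx_scores(tile)
graph = graph_anomaly_scores(tile)
```

In practice both scores would be computed independently per spatial tile, which is what gives the graph approach its spatially adaptive background model and keeps the pairwise-distance computation tractable.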