We developed a Textural Classifier Neural Network Algorithm (TCNNA) to process Synthetic Aperture Radar (SAR) data to map oil spills. The algorithm processes SAR data and wind model outputs (CMOD5) using a combination of two neural networks. The first neural network filters out areas of the image that do not need to be processed by flagging pixels as oil candidates; the second performs a statistical textural analysis to differentiate between pixels of sea surface with and without floating oil. By combining the two neural networks, we are able to process a full-resolution GeoTIFF SAR image (16-bit, ~350 MB) in less than one minute on a conventional PC. The algorithm performs efficiently at all radar incidence angles when wind speeds exceed 3 m/s. Under low wind conditions, the performance of the neural network classification is limited; however, the algorithm output allows the user to easily discard misclassified elements and export the final product as a map of the water covered by oil. This algorithm allowed us to rapidly process all of the images collected by Envisat during the Gulf of Mexico (GOM) Deepwater Horizon (DWH) oil spill event. By normalizing oil detections by the frequency with which each area was sampled, we estimate that oil covered a mean daily area of 10,750 km² (with a total extent of 119,600 km² of the GOM surface waters), and that approximately 1,300 km of the northern GOM shoreline was threatened by the presence of drifting oil.
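The two-stage data flow described above can be sketched as follows. This is a minimal illustration only: the paper's actual neural networks are not reproduced here, and the stand-in functions (`stage1_filter`, `stage2_classify`), the dark-pixel threshold, the texture window size, and the smoothness cutoff are all hypothetical placeholders chosen to show how a fast candidate filter can restrict an expensive textural analysis to a small subset of pixels.

```python
import numpy as np

def stage1_filter(backscatter, wind_speed, dark_threshold=-2.0, min_wind=3.0):
    """Flag pixels as oil candidates (stand-in for the first neural network).

    Hypothetical rule: a pixel is a candidate if the SAR backscatter is
    anomalously dark (oil damps capillary waves) and the modeled wind
    speed is high enough for detection (the paper notes performance is
    limited below ~3 m/s).
    """
    usable = wind_speed >= min_wind
    dark = backscatter < dark_threshold
    return usable & dark

def stage2_classify(backscatter, candidates, window=3, std_cutoff=0.5):
    """Classify candidate pixels as oil vs. oil-free sea surface.

    Hypothetical stand-in for the second neural network's statistical
    textural analysis: here, a candidate pixel is called oil when its
    local neighborhood is both dark and smooth (low standard deviation).
    Only candidate pixels are visited, which is what makes the combined
    pipeline fast on a full-resolution image.
    """
    r = window // 2
    oil = np.zeros_like(candidates, dtype=bool)
    for i, j in zip(*np.nonzero(candidates)):
        patch = backscatter[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1]
        oil[i, j] = patch.std() < std_cutoff  # smooth dark patch => likely oil
    return oil
```

On a synthetic image, a uniformly dark patch survives both stages while a dark but rough (high-variance) patch is rejected by the texture test; in the real algorithm, both decisions would instead come from trained network outputs.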