Current art uses metadata associated with satellite images to facilitate their retrieval from image repositories. Typical metadata are geographic location, time, and data type. Because the metadata do not indicate which regions within an image are obscured by clouds, retrieval with such metadata may produce an image in which the user's region of interest (ROI) is not visible. We report a system that can automatically determine whether an ROI is visible in an image and can incorporate this determination into the metadata for individual images to enhance searching capability. The goal is to annotate each image with metadata describing the visibility of a number of ROIs. In an experiment, the system annotated 236 advanced very high resolution radiometer (AVHRR) images of the North Atlantic, acquired over a five-month viewing period, with descriptors expressing the visibility of an ROI centered on Long Island, NY. For ground truth, we used the classifications of three human subjects to determine visibility of the same ROI, and labeled each ROI with the majority decision of the three subjects. Partial cloud cover made the human determination subjective and resulted in disagreements among the subjects. Using randomly selected training subsets of the images, we identified the two images whose ROI regions most closely resembled those in images for which the Long Island region was visible.
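The ground-truth labeling described above reduces to a majority vote over three binary visibility judgments per image. A minimal sketch of that step, assuming hypothetical image identifiers and votes (all names and data below are illustrative, not from the paper):

```python
from collections import Counter

def majority_label(votes):
    """Return the majority decision among an odd number of binary votes.

    Each vote is True (ROI visible) or False (ROI obscured by cloud).
    With three subjects, a tie cannot occur.
    """
    counts = Counter(votes)
    return counts.most_common(1)[0][0]

# Hypothetical per-image votes from three human subjects.
votes_by_image = {
    "avhrr_001": [True, True, False],   # disagreement under partial cloud cover
    "avhrr_002": [False, False, False],
}

labels = {img: majority_label(v) for img, v in votes_by_image.items()}
print(labels)
```

With an odd number of subjects the majority label is always well defined, which is one reason panels of three are a common choice for subjective annotation tasks.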