We explore a promising new approach to understanding the neural filter mechanisms intermediate in motion processing at low luminance. We carefully account for the known filter properties of early stages of visual processing in a nocturnal moth, and then measured spatiotemporal tuning of higher order neurons. We then use a computational model to identify likely strategies used to reject noisy signals at higher-order stages of motion detection. In so doing, we provide the first description of the spatial and temporal `pooling' filters in motion vision of nocturnal insects.