This paper studies the question of which filters should be used for feature point detection. Classical feature point detection methods, e.g., SIFT, are based on scale-space theory, in which Gaussian filters are proven to be optimal under the scale-space axiom. However, the more recent SURF method demonstrates empirically that a box filter can also achieve good performance even though it violates the scale-space axiom. This raises the question: Are Gaussian filters necessary for feature point detection? Based on an analysis using filter banks and detection theory, we show that it is theoretically possible for a box filter to outperform the Gaussian filter. Additionally, we show that a new filter, the pyramid filter, performs better than both box and Gaussian filters in some situations.
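To make the three filter families concrete, below is a minimal sketch of 1-D versions of the kernels the abstract compares. It assumes "pyramid filter" means a triangle-shaped kernel obtained by convolving a box with itself (a common construction, not taken from the paper), and the smoothing demo on a noisy step edge is purely illustrative:

```python
import numpy as np

def box_kernel(radius):
    """Uniform (box) kernel of width 2*radius + 1, normalized to sum to 1."""
    n = 2 * radius + 1
    return np.full(n, 1.0 / n)

def gaussian_kernel(sigma, radius):
    """Sampled Gaussian kernel, truncated at +/- radius and normalized."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def pyramid_kernel(radius):
    """Triangle ("pyramid") kernel: a box convolved with itself (an assumption
    about the paper's construction), renormalized to sum to 1."""
    b = box_kernel(radius)
    k = np.convolve(b, b)
    return k / k.sum()

if __name__ == "__main__":
    # Smooth a noisy 1-D step edge with each filter for a visual comparison.
    rng = np.random.default_rng(0)
    signal = np.concatenate([np.zeros(50), np.ones(50)])
    signal += 0.1 * rng.standard_normal(signal.size)
    for name, k in [("box", box_kernel(3)),
                    ("gaussian", gaussian_kernel(1.5, 3)),
                    ("pyramid", pyramid_kernel(3))]:
        smoothed = np.convolve(signal, k, mode="same")
        print(f"{name:8s} kernel width {k.size:2d}, response range "
              f"[{smoothed.min():.3f}, {smoothed.max():.3f}]")
```

Note that the box kernel is the cheapest to apply (it can be evaluated in constant time per pixel with integral images, as SURF does), while the pyramid kernel sits between the box and the Gaussian in smoothness.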