Kernel-based methods have recently been widely used in image denoising, and tuning the parameters of these algorithms directly affects their performance. In this paper, an iterative method is proposed that optimizes the performance of any kernel-based denoising algorithm in the mean-squared error (MSE) sense, even with arbitrary parameters. In this work we estimate the MSE in each image patch, and use this estimate to decide when to stop the iterative application of the denoiser, thereby improving performance. We propose a new estimator for the risk (i.e., MSE) that differs from the often-employed SURE method, and we illustrate that the proposed risk estimate can outperform SURE in many instances.
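The stopping rule described above can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes a placeholder Gaussian-blur kernel denoiser, and it uses the oracle MSE (computed against the clean image) in place of the paper's risk estimate, since the estimator itself is not specified in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel_denoise(img, sigma=1.0, radius=3):
    # Placeholder kernel denoiser: separable Gaussian blur.
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    out = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)
    return out

def iterative_denoise(noisy, clean, max_iters=20):
    # Iteratively apply the denoiser; stop as soon as the risk
    # (here: oracle MSE, standing in for an MSE estimate) stops decreasing.
    current = noisy
    best_mse = np.mean((current - clean) ** 2)
    for _ in range(max_iters):
        candidate = kernel_denoise(current)
        mse = np.mean((candidate - clean) ** 2)
        if mse >= best_mse:
            break  # further iterations would only degrade the result
        current, best_mse = candidate, mse
    return current, best_mse

# Synthetic test image: a bright square corrupted by Gaussian noise.
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0
noisy = clean + 0.3 * rng.standard_normal(clean.shape)

denoised, final_mse = iterative_denoise(noisy, clean)
```

In practice the oracle MSE is unavailable; the point of the paper's risk estimator (like SURE) is to approximate it from the noisy data alone so the same stopping rule can be applied.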