The radial basis function neural network (RBFNN) is a well-known method for many kinds of applications, including function approximation, classification, and prediction. However, the traditional RBFNN is not robust to training data that contain outliers. In this paper, we propose a two-stage learning rule for the RBFNN to eliminate the influence of outliers. In the first stage, Chebyshev's theorem for outlier detection is adopted to filter out the potential gross outliers; in the second stage, an M-estimator is used to handle the remaining, less pronounced outliers. The experimental results show that the proposed method reduces the prediction error compared with other methods. Furthermore, even when fifty percent of all observations are outliers, the method still performs well.
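The two stages described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the threshold constant, contamination rate, and the choice of the Huber weight function for the M-estimator stage are assumptions for the sake of the example.

```python
import numpy as np

def chebyshev_outlier_filter(residuals, p=0.25):
    """Stage 1: flag gross outliers via Chebyshev's inequality.

    Chebyshev's theorem states P(|X - mu| >= k*sigma) <= 1/k^2 for any
    distribution, so choosing k = 1/sqrt(p) bounds the fraction of points
    farther than k standard deviations from the mean by p.
    """
    k = 1.0 / np.sqrt(p)
    mu, sigma = residuals.mean(), residuals.std()
    return np.abs(residuals - mu) < k * sigma  # True = keep the sample

def huber_weight(residuals, c=1.345):
    """Stage 2: M-estimator (Huber) weights for the surviving samples.

    Small residuals get full weight 1; larger ones are down-weighted by
    c/|r|, so milder outliers missed by stage 1 have reduced influence.
    """
    a = np.abs(residuals)
    return np.where(a <= c, 1.0, c / a)

# Hypothetical data: 95 well-behaved residuals and 5 gross outliers.
rng = np.random.default_rng(0)
res = np.concatenate([rng.normal(0.0, 1.0, 95),
                      rng.normal(50.0, 1.0, 5)])

mask = chebyshev_outlier_filter(res)       # drops the 5 gross outliers
weights = huber_weight(res[mask])          # down-weights mild deviations
```

In a full training loop, the weights from the second stage would scale each sample's contribution to the RBFNN's least-squares objective, so the network fit is dominated by the well-behaved observations.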