The success of neural network architectures depends heavily on the availability of effective learning algorithms. Radial basis function (RBF) neural networks offer attractive possibilities for solving signal processing and pattern classification problems. Gradient descent (GD) training of RBF networks has proven much more effective than conventional methods; however, it is computationally expensive and converges slowly. This paper compares GD with training methods based on Kalman filtering (KF) and the decoupled Kalman filter (DEKF). These methods train faster than gradient descent while delivering the same level of accuracy when applied to fingerprint-based positioning.
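To make the comparison concrete, the following is a minimal sketch of GD training of an RBF network on a toy 1-D regression task. It is illustrative only: the fixed Gaussian centers, the shared width, the learning rate, and the sin-function target are arbitrary choices for demonstration, not details taken from the paper.

```python
import numpy as np

# Toy data: approximate sin(x) on [0, 2*pi]
X = np.linspace(0.0, 2.0 * np.pi, 50)
y = np.sin(X)

# Fixed Gaussian RBF centers and a shared width (illustrative values)
centers = np.linspace(0.0, 2.0 * np.pi, 10)
width = 0.5

def phi(x):
    """Gaussian basis activations for a batch of scalar inputs."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

w = np.zeros(len(centers))          # output-layer weights
lr = 0.1                            # learning rate

# Plain batch gradient descent on the mean squared error
for _ in range(2000):
    H = phi(X)                      # (n_samples, n_centers) design matrix
    err = H @ w - y                 # prediction errors
    w -= lr * H.T @ err / len(X)    # GD step

mse = np.mean((phi(X) @ w - y) ** 2)
print(mse)
```

The many iterations needed for the error to shrink illustrate the slow convergence the abstract attributes to GD; KF-style methods instead update the weights recursively from second-order error statistics, which is what makes them faster in practice.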