An accelerated hybrid learning algorithm is proposed for training fuzzy wavelet neural networks (FWNNs). The algorithm obtains the initial parameters with a clustering algorithm and then updates them with a combination of backpropagation and recursive least-squares methods. The parameters are updated in the direction of steepest descent, but with a local adaptive learning rate that differs at each epoch and depends only on the sign of the gradient of the error function. This adaptive learning rate is chosen so as to accelerate convergence. The learning algorithm is inspired by the halving method for finding polynomial roots. Although the results are quite satisfactory, the algorithm is much simpler than others that have been reported. In addition, unlike much of the research in this area, its adaptation formula contains no excess terms. Furthermore, the algorithm somewhat improves accuracy while using far fewer parameters. Simulation results indicate superior convergence speed compared with other FWNN training methods.
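The sign-based adaptive learning rate described above can be sketched as follows. This is a hypothetical illustration, not the authors' exact update rule: it assumes a per-parameter rate that grows while the gradient keeps its sign across epochs and is halved when the sign flips (a minimum was overstepped), echoing the halving idea mentioned in the abstract. The function name, growth factor, and rate bounds are all assumptions for the sketch.

```python
import numpy as np

def sign_adaptive_step(params, grads, prev_grads, rates,
                       grow=1.2, shrink=0.5,
                       rate_min=1e-6, rate_max=1.0):
    """One steepest-descent epoch with a sign-based adaptive rate.

    Hypothetical sketch: the per-parameter rate depends only on the
    sign of the gradient across consecutive epochs -- it grows when
    the sign is unchanged and is halved when the sign flips.
    """
    same_sign = np.sign(grads) * np.sign(prev_grads)
    rates = np.where(same_sign > 0, rates * grow, rates)    # keep going: speed up
    rates = np.where(same_sign < 0, rates * shrink, rates)  # overshot: halve the rate
    rates = np.clip(rates, rate_min, rate_max)
    # steepest-descent update with the adapted per-parameter rate
    params = params - rates * grads
    return params, rates
```

Because the adaptation uses only gradient signs, it needs no second-order terms or extra bookkeeping, which matches the abstract's claim of a simple update formula with few parameters.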