The error back-propagation (BP) algorithm, which is based on the sum-of-squared-errors rule together with a sigmoid or hyperbolic activation function, has been widely used. To address the shortcomings and limitations of BP, this paper sets forth the cross-entropy theory with a detailed derivation of the formulae. A new activation function, whose value range is 0 to 1, is also put forward, and the new algorithm can adjust its learning speed through an adjustable parameter. By combining the cross-entropy criterion with the new activation function, the training of the feed-forward neural network (FNN) is sped up and its dynamic performance is improved. Tests show that the new algorithm is faster than the BP algorithm at every processing step, without the risk of falling into a non-convergent state. Meanwhile, by introducing a fuzzy system, the linear character of BP is transformed into a non-linear one and the black-box problem is removed, which makes the output of the neural network easier to understand.
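The speed-up from replacing the squared-error rule with cross-entropy can be illustrated at the output layer. The sketch below (an illustrative assumption, not the paper's exact formulation or its new activation function) compares the two error signals for a single sigmoid output neuron: the squared-error delta carries a factor y(1 - y) that vanishes when the sigmoid saturates, while the cross-entropy delta stays proportional to the raw error, so weight updates do not stall.

```python
import math

def sigmoid(x):
    """Standard logistic activation, range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def delta_squared_error(y, t):
    # Output-layer error signal under the sum-of-squared-errors rule:
    # the y*(1-y) factor shrinks toward 0 as the sigmoid saturates.
    return (y - t) * y * (1.0 - y)

def delta_cross_entropy(y, t):
    # Under the cross-entropy criterion the sigmoid derivative cancels,
    # leaving a delta proportional to the raw output error.
    return y - t

# A saturated output (y close to 1) with target 0: squared error
# yields a tiny gradient, cross-entropy keeps the full error signal.
y = sigmoid(4.0)
print(delta_squared_error(y, 0.0))
print(delta_cross_entropy(y, 0.0))
```

Because the cross-entropy delta is not attenuated by the activation's derivative, learning remains fast even when outputs are far wrong but saturated, which is consistent with the faster convergence claimed above.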