Traditional neural networks with continuous weights, while easy to implement in software, are often burdensome and costly to realize in embedded hardware systems. Hardware-friendly neural networks are therefore essential for a functional and effective embedded implementation. To this end, this paper presents a GA-based algorithm for training neural networks with discrete weights and a quantized non-linear activation function. The performance of this procedure is evaluated by comparing it with the multi-threshold method and a continuous-discrete learning method based on computing the gradient of the error function; simulation results show that the new learning algorithm greatly outperforms both in convergence and generalization.
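To make the idea concrete, the following is a minimal, hypothetical sketch of GA-based training with discrete weights and a quantized activation. The weight alphabet, quantization scheme, fitness function, and all GA hyperparameters (population size, truncation selection, one-point crossover, mutation rate) are illustrative assumptions, not the specific choices of the paper.

```python
import random

# Assumed discrete weight alphabet; the paper's actual weight set may differ.
WEIGHT_SET = [-2, -1, 0, 1, 2]

def quantized_act(x, levels=4, lo=-4.0, hi=4.0):
    """Staircase (quantized) approximation of a sigmoid-like activation."""
    x = max(lo, min(hi, x))
    step = (hi - lo) / levels
    return round((x - lo) / step) / levels  # output in {0, 1/levels, ..., 1}

def forward(weights, x):
    # Single neuron for illustration: weighted sum, then quantized activation.
    s = sum(w * xi for w, xi in zip(weights, x))
    return quantized_act(s)

def fitness(weights, data):
    # Negative sum of squared errors over the training set (higher is better).
    return -sum((forward(weights, x) - t) ** 2 for x, t in data)

def ga_train(data, n_inputs, pop_size=30, gens=100, p_mut=0.1, seed=0):
    rng = random.Random(seed)
    # Initial population: random discrete weight vectors.
    pop = [[rng.choice(WEIGHT_SET) for _ in range(n_inputs)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda w: fitness(w, data), reverse=True)
        elite = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_inputs)  # one-point crossover
            child = a[:cut] + b[cut:]
            # Mutation replaces a gene with another value from WEIGHT_SET,
            # so weights stay discrete throughout training.
            child = [rng.choice(WEIGHT_SET) if rng.random() < p_mut else w
                     for w in child]
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda w: fitness(w, data))
```

Because mutation and crossover operate directly on the discrete alphabet, every candidate is hardware-realizable at all times, and no gradient of the (non-differentiable) quantized activation is ever needed.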