Neural network models serve as powerful interpolation tools for accurately modeling digital I/O buffer circuits. Training such a network requires complex training algorithms, and optimizing the network is further complicated by the large number of variable parameters involved. Genetic algorithms are well suited to problems with very large solution spaces because they can quickly find a near-optimal solution without an exhaustive search. In this paper, a methodology based on genetic algorithms is proposed to optimize a neural network model so that it accurately captures the nonlinearity of digital driver circuits. The proposed methodology is tested on IBM driver circuits, and the results show a significant improvement in the accuracy of the neural network model.
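To make the idea concrete, below is a minimal, self-contained sketch of genetic-algorithm hyperparameter search for a small neural network. It is purely illustrative and not the paper's implementation: the toy tanh curve standing in for a driver's nonlinear characteristic, the one-hidden-layer MLP, the genome (hidden-layer size and learning rate), and the selection, crossover, and mutation rules are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a driver's nonlinear I-V characteristic (illustrative only).
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.tanh(3.0 * x) + 0.1 * x**3

def train_mlp(n_hidden, lr, epochs=300):
    """Train a one-hidden-layer tanh MLP with plain gradient descent;
    return its mean-squared error on the toy curve."""
    W1 = rng.normal(0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ W1 + b1)
        err = (h @ W2 + b2) - y
        # Backpropagate the squared-error loss.
        gW2 = h.T @ err / len(x); gb2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h**2)
        gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return float(np.mean(err**2))

def fitness(genome):
    n_hidden, lr = genome
    return -train_mlp(int(n_hidden), lr)  # higher fitness = lower model error

# Genetic algorithm over the genome (hidden-layer size, learning rate).
pop = [(rng.integers(2, 20), rng.uniform(0.01, 0.5)) for _ in range(12)]
for gen in range(10):
    scored = sorted(pop, key=fitness, reverse=True)
    parents = scored[:4]                       # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = rng.choice(len(parents), 2, replace=False)
        # Crossover: mix the two parents' genes.
        n = parents[a][0] if rng.random() < 0.5 else parents[b][0]
        lr = 0.5 * (parents[a][1] + parents[b][1])
        if rng.random() < 0.3:                 # mutation
            n = int(np.clip(n + rng.integers(-2, 3), 2, 32))
            lr = float(np.clip(lr * rng.uniform(0.5, 1.5), 1e-3, 0.5))
        children.append((n, lr))
    pop = parents + children                   # elitism: keep the best parents
best = max(pop, key=fitness)
print(f"best hidden units = {best[0]}, learning rate = {best[1]:.3f}")
```

The sketch illustrates why a GA fits this task: each fitness evaluation retrains a candidate network, so the search never enumerates the full hyperparameter space, yet selection pressure steers the population toward configurations that capture the nonlinearity with low error.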